diff --git a/.github/ISSUE_TEMPLATE/a-improve-docs.yml b/.github/ISSUE_TEMPLATE/a-improve-docs.yml index 70b173e49a4..c9030bc227b 100644 --- a/.github/ISSUE_TEMPLATE/a-improve-docs.yml +++ b/.github/ISSUE_TEMPLATE/a-improve-docs.yml @@ -5,7 +5,7 @@ body: - type: markdown attributes: value: | - * You can ask questions or submit ideas for the dbt docs in [Discussions](https://github.com/dbt-labs/docs.getdbt.com/discussions) + * You can ask questions or submit ideas for the dbt docs in [Issues](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) * Before you file an issue read the [Contributing guide](https://github.com/dbt-labs/docs.getdbt.com#contributing). * Check to make sure someone hasn't already opened a similar [issue](https://github.com/dbt-labs/docs.getdbt.com/issues). diff --git a/.github/ISSUE_TEMPLATE/improve-the-site.yml b/.github/ISSUE_TEMPLATE/improve-the-site.yml index dd585324f89..01ebdea711a 100644 --- a/.github/ISSUE_TEMPLATE/improve-the-site.yml +++ b/.github/ISSUE_TEMPLATE/improve-the-site.yml @@ -5,7 +5,7 @@ body: - type: markdown attributes: value: | - * You can ask questions or submit ideas for the dbt docs in [Discussions](https://github.com/dbt-labs/docs.getdbt.com/discussions) + * You can ask questions or submit ideas for the dbt docs in [Issues](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) * Before you file an issue read the [Contributing guide](https://github.com/dbt-labs/docs.getdbt.com#contributing). * Check to make sure someone hasn't already opened a similar [issue](https://github.com/dbt-labs/docs.getdbt.com/issues). diff --git a/.github/labeler.yml b/.github/labeler.yml index 176f1874009..316098eb51c 100644 --- a/.github/labeler.yml +++ b/.github/labeler.yml @@ -3,6 +3,7 @@ developer blog: guides: - website/docs/guides/**/* +- website/docs/quickstarts/**/* content: - website/docs/**/* diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index 90f4938d2cb..c9b25d3b71c 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -10,7 +10,7 @@ To learn more about the writing conventions used in the dbt Labs docs, see the [ - [ ] Review the [Content style guide](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/content-style-guide.md) and [About versioning](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#adding-a-new-version) so my content adheres to these guidelines. - [ ] Add a checklist item for anything that needs to happen before this PR is merged, such as "needs technical review" or "change base branch." diff --git a/.github/workflows/labeler.yml b/.github/workflows/labeler.yml index 7e4bb5c268a..cc231cdcde3 100644 --- a/.github/workflows/labeler.yml +++ b/.github/workflows/labeler.yml @@ -5,8 +5,8 @@ name: "Pull Request Labeler" on: -- pull_request_target - + pull_request_target: + types: [opened] jobs: triage: permissions: diff --git a/README.md b/README.md index da82ab45fd6..c749fedf95a 100644 --- a/README.md +++ b/README.md @@ -17,7 +17,7 @@ Creating an inclusive and equitable environment for our documents is more import We welcome contributions from community members to this repo: - **Fixes**: When you notice an error, you can use the `Edit this page` button at the bottom of each page to suggest a change. - **New documentation**: If you contributed code in [dbt-core](https://github.com/dbt-labs/dbt-core), we encourage you to also write the docs here! 
Please reach out in the dbt community if you need help finding a place for these docs. -- **Major rewrites**: You can [file an issue](https://github.com/dbt-labs/docs.getdbt.com/issues/new?assignees=&labels=content%2Cimprovement&template=improve-docs.yml) or [start a discussion](https://github.com/dbt-labs/docs.getdbt.com/discussions) to propose ideas for a content area that requires attention. +- **Major rewrites**: You can [file an issue](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) to propose ideas for a content area that requires attention. You can use components documented in the [docusaurus library](https://v2.docusaurus.io/docs/markdown-features/). diff --git a/contributing/content-style-guide.md b/contributing/content-style-guide.md index eaa090a00b6..204c5c854f4 100644 --- a/contributing/content-style-guide.md +++ b/contributing/content-style-guide.md @@ -229,7 +229,7 @@ When referring to different sections of the IDE, use the name of the section and People make use of titles in many places like table headers, section headings (such as an H2, H3, or H4), page titles, sidebars, and so much more. -When generating titles or updating them, use sentence case. It sets a more conversational tone to the docs—making the content more approachable and creating a friendly feel. +When generating titles or updating them, use sentence case. It sets a more conversational tone to the docs— making the content more approachable and creating a friendly feel. We've defined five content types you can use when contributing to the docs (as in, writing or authoring). Learn more about title guidelines for [each content type](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/content-types.md). @@ -239,7 +239,7 @@ Placeholder text is something that the user should replace with their own text. Use all capital letters([screaming snake case](https://fission.codes/blog/screaming-snake-case/)) to indicate text that changes in the user interface or that the user needs to supply in a command or code snippet. Avoid surrounding it in brackets or braces, which someone might copy and use, producing an error. -Identify what the user should replace the placeholder text with in the paragraph preceding the code snippet or command. +Identify what the user should replace the placeholder text within the paragraph preceding the code snippet or command. :white_check_mark: The following is an example of configuring a connection to a Redshift database. In your YAML file, you must replace `CLUSTER_ID` with the ID assigned to you during setup: @@ -276,7 +276,7 @@ Guidelines for making lists are: - There are at least two items. - All list items follow a consistent, grammatical structure (like each item starts with a verb, each item begins with a capitalized word, each item is a sentence fragment). - Lists items don't end in commas, semicolons, or conjunctions (like "and", "or"). However, you can use periods if they’re complete sentences. -- Introduce the list with a heading or, if it's within text, as a complete sentence or as a sentence fragment followed by a colon. +- Introduce the list with a heading or, if it's within the text, as a complete sentence or as a sentence fragment followed by a colon. If the list starts getting lengthy and dense, consider presenting the same content in a different format such as a table, as separate subsections, or a new guide. @@ -286,7 +286,7 @@ A bulleted list with introductory text: > A dbt project is a directory of `.sql` and .yml` files. 
The directory must contain at a minimum: > -> - Models: A model is a single `.sql` file. Each model contains a single `select` statement that either transforms raw data into a dataset that is ready for analytics, or, more often, is an intermediate step in such a transformation. +> - Models: A model is a single `.sql` file. Each model contains a single `select` statement that either transforms raw data into a dataset that is ready for analytics or, more often, is an intermediate step in such a transformation. > - A project file: A `dbt_project.yml` file, which configures and defines your dbt project. A bulleted list with sentence fragments: @@ -307,10 +307,10 @@ A numbered list following an H2 heading: ## Tables Tables provide a great way to present complex information and can help the content be more scannable for users, too. -There are many ways to construct a table, like row spanning and cell splitting. Make sure the content is clear, concise, and presents well on the web page (like avoid awkward word wrapping). +There are many ways to construct a table, such as row spanning and cell splitting. The content should be clear, concise, and presented well on the web page (for example, avoid awkward word wrapping). Guidelines for making tables are: -- Introduce the table with a heading or, if it's within text, as a complete sentence or as a sentence fragment followed by a colon. +- Introduce the table with a heading or, if it's within the text, as a complete sentence or as a sentence fragment followed by a colon. - Use a header row - Use sentence case for all content, including the header row - Content can be complete sentences, sentence fragments, or single words (like `Currency`) @@ -338,7 +338,7 @@ A table following an H3 heading: > | Name | Description | Values | > | -----| ----------- | ------ | > | `-help` | Displays information on how to use the command. | Doesn't take any values. | -> | `-readable` | Print output in human readable format. | | +> | `-readable` | Print output in human-readable format. | | > | `-file` | Print output to file instead of stdout. | Name of the file. | ## Cards @@ -349,7 +349,7 @@ You can configure a card in 2, 3, 4, or 5-column grids. To maintain a good user There won't be many instances where you need to display 4 or 5 cards on the docs site. While we recommend you use 2 or 3-column grids, you can use 4 or 5-column grids in the following scenarios: -- For cards that contain little text and limited to under 15 words. (This is to make sure the text isn't squished) +- For cards that contain little text and are limited to 15 words or less. This is to make sure the text isn't squished. - Always have the `hide_table_of_contents:` frontmatter set to `true` (This hides the right table of contents). Otherwise, the text will appear squished and provide users with a bad experience. @@ -371,16 +371,16 @@ To create cards in markdown, you need to: - Add the props within the card component, including `title`,`body`,`link`,`icon`. - Close out the div by using `` -Refer to the following prop list for detailed explanation and examples: +Refer to the following prop list for detailed explanations and examples: | Prop | Type | Info | Example | | ---- | ---- | ---- | ------- | | `title` | required | The title should be clear and explain an action the user should take or a product/feature. | `title: dbt Cloud IDE` | `body` | required | The body contains the actionable or informative text for the user. 
You can include `` | +| `icon` | optional but recommended | You can add an icon to the card component by using any icons found in the [icons](https://github.com/dbt-labs/docs.getdbt.com/tree/current/website/static/img/icons) directory.
* Icons are added in .svg format and you must add icons in two locations: website/static/img/icons and website/static/img/icons/white. This is so users can view the icons in dark or light mode on the docs.getdbt.com site. | ` icon="pencil-paper"/>` | -The following is an example of a 4 card column: +The following is an example of a 4-card column: ```
@@ -488,9 +488,16 @@ Avoid ending a sentence with a preposition unless the rewritten sentence would s Product names, trademarks, services, and tools should be written as proper nouns, unless otherwise specified by the company or trademark owner. +As of October 2023, avoid using "dbt CLI" or "CLI" terminology when referring to the dbt Cloud CLI or dbt Core. However, if referring to the command line as a tool, CLI is acceptable. + +dbt officially provides two command line tools for running dbt commands: + +- [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) — This tool allows you to develop locally and execute dbt commands against your dbt Cloud development environment from your local command line. +- [dbt Core](https://github.com/dbt-labs/dbt-core) — This open-source tool is designed for local installation, enabling you to use dbt Core on the command line and communicate with databases through adapters. + ### Terms to use or avoid -Use industry-specific terms and research new/improved terminology. Also refer to the Inclusive Language section of this style guide for inclusive and accessible language and style. +Use industry-specific terms and research new/improved terminology. Also, refer to the Inclusive Language section of this style guide for inclusive and accessible language and style. **DO NOT** use jargon or language familiar to a small subset of readers or assume that your readers understand ALL technical terms. @@ -507,11 +514,13 @@ sign in | log in, login sign up | signup terminal | shell username | login +dbt Cloud CLI | CLI, dbt CLI +dbt Core | CLI, dbt CLI
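To make the distinction concrete, here is a minimal sketch of running the same command with each tool (the adapter package `dbt-snowflake` and the selector `my_model` are illustrative placeholders, not recommendations):

```shell
# dbt Core: install locally with an adapter, then run against your local profiles.yml
python -m pip install dbt-core dbt-snowflake
dbt run --select my_model

# dbt Cloud CLI: the same command, executed against your dbt Cloud development environment
dbt run --select my_model
```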
## Links -Links embedded in documentation are about trust. Users trust that we will lead them to sites or pages related to their reading content. In order to maintain that trust, it's important that links are transparent, up-to-date, and lead to legitimate resources. +Links embedded in the documentation are about trust. Users trust that we will lead them to sites or pages related to their reading content. In order to maintain that trust, it's important that links are transparent, up-to-date, and lead to legitimate resources. ### Internal links diff --git a/website/blog/2021-11-23-how-to-upgrade-dbt-versions.md b/website/blog/2021-11-23-how-to-upgrade-dbt-versions.md index 87b3ea7bd1e..69ca0b2522c 100644 --- a/website/blog/2021-11-23-how-to-upgrade-dbt-versions.md +++ b/website/blog/2021-11-23-how-to-upgrade-dbt-versions.md @@ -62,7 +62,7 @@ As noted above, the project is on 0.16.0 right now. 0.17.2 is the final patch re > > Practically, it also lets you lock in "checkpoints" of known-stable setups. If you need to pause your migration work to deal with an urgent request, you can safely deploy what you've finished so far instead of having a bunch of unrelated half-finished changes. -Review the migration guides to get an initial indication of what changes you might need to make. For example, in [the migration guide for 0.17.0](/guides/migration/versions), there are several significant changes to dbt's functionality, but it's unlikely that all of them will apply to your project. We'll cover this more later. +Review the migration guides to get an initial indication of what changes you might need to make. For example, in [the migration guide for 0.17.0](/docs/dbt-versions/core-upgrade), there are several significant changes to dbt's functionality, but it's unlikely that all of them will apply to your project. We'll cover this more later. ## Step 2: `Add require-dbt-version` to your `dbt_project.yml` file. @@ -126,9 +126,9 @@ In this case, our example project probably has dbt 0.3.0 installed. By reviewing ### Step 5b. Fix errors, then warnings -Obviously, errors that stop you from running your dbt project at all are the most important to deal with. Let's assume that our project used a too-broadly-scoped variable in a macro file, support for which was removed in v0.17. The [migration guide explains what to do instead](/guides/migration/versions), and it's a pretty straightforward fix. +Obviously, errors that stop you from running your dbt project at all are the most important to deal with. Let's assume that our project used a too-broadly-scoped variable in a macro file, support for which was removed in v0.17. The [migration guide explains what to do instead](/docs/dbt-versions/core-upgrade), and it's a pretty straightforward fix. -Once your errors are out of the way, have a look at warnings. For example, 0.17 introduced `config-version: 2` to `dbt_project.yml`. Although it's backwards compatible for now, we know that support for the old version will be removed in a future version of dbt so we might as well deal with it now. Again, the migration guide explains [what we need to do](/guides/migration/versions), and how to take full advantage of the new functionality in the future. +Once your errors are out of the way, have a look at warnings. For example, 0.17 introduced `config-version: 2` to `dbt_project.yml`. Although it's backwards compatible for now, we know that support for the old version will be removed in a future version of dbt so we might as well deal with it now. 
Again, the migration guide explains [what we need to do](/docs/dbt-versions/core-upgrade), and how to take full advantage of the new functionality in the future. ### Stay focused diff --git a/website/blog/2021-11-29-dbt-airflow-spiritual-alignment.md b/website/blog/2021-11-29-dbt-airflow-spiritual-alignment.md index 0a2ec874a22..fd1a11c41cf 100644 --- a/website/blog/2021-11-29-dbt-airflow-spiritual-alignment.md +++ b/website/blog/2021-11-29-dbt-airflow-spiritual-alignment.md @@ -144,22 +144,22 @@ An analyst will be in the dark when attempting to debug this, and will need to r This can be perfectly ok, in the event your data team is structured for data engineers to exclusively own dbt modeling duties, but that’s a quite uncommon org structure pattern from what I’ve seen. And if you have easy solutions for this analyst-blindness problem, I’d love to hear them. Once the data has been ingested, dbt Core can be used to model it for consumption. Most of the time, users choose to either: -Use the dbt CLI+ [BashOperator](https://registry.astronomer.io/providers/apache-airflow/modules/bashoperator) with Airflow (If you take this route, you can use an external secrets manager to manage credentials externally), or +Use the dbt Core CLI+ [BashOperator](https://registry.astronomer.io/providers/apache-airflow/modules/bashoperator) with Airflow (If you take this route, you can use an external secrets manager to manage credentials externally), or Use the [KubernetesPodOperator](https://registry.astronomer.io/providers/kubernetes/modules/kubernetespodoperator) for each dbt job, as data teams have at places like [Gitlab](https://gitlab.com/gitlab-data/analytics/-/blob/master/dags/transformation/dbt_trusted_data.py#L72) and [Snowflake](https://www.snowflake.com/blog/migrating-airflow-from-amazon-ec2-to-kubernetes/). Both approaches are equally valid; the right one will depend on the team and use case at hand. | | Dependency management | Overhead | Flexibility | Infrastructure Overhead | |---|---|---|---|---| -| dbt CLI + BashOperator | Medium | Low | Medium | Low | +| dbt Core CLI + BashOperator | Medium | Low | Medium | Low | | Kubernetes Pod Operator | Very Easy | Medium | High | Medium | | | | | | | If you have DevOps resources available to you, and your team is comfortable with concepts like Kubernetes pods and containers, you can use the KubernetesPodOperator to run each job in a Docker image so that you never have to think about Python dependencies. Furthermore, you’ll create a library of images containing your dbt models that can be run on any containerized environment. However, setting up development environments, CI/CD, and managing the arrays of containers can mean a lot of overhead for some teams. Tools like the [astro-cli](https://github.com/astronomer/astro-cli) can make this easier, but at the end of the day, there’s no getting around the need for Kubernetes resources for the Gitlab approach. -If you’re just looking to get started or just don’t want to deal with containers, using the BashOperator to call the dbt CLI can be a great way to begin scheduling your dbt workloads with Airflow. +If you’re just looking to get started or just don’t want to deal with containers, using the BashOperator to call the dbt Core CLI can be a great way to begin scheduling your dbt workloads with Airflow. -It’s important to note that whichever approach you choose, this is just a first step; your actual production needs may have more requirements. 
If you need granularity and dependencies between your dbt models, like the team at [Updater does, you may need to deconstruct the entire dbt DAG in Airflow.](https://www.astronomer.io/guides/airflow-dbt#use-case-2-dbt-airflow-at-the-model-level) If you’re okay managing some extra dependencies, but want to maximize control over what abstractions you expose to your end users, you may want to use the [GoCardlessProvider](https://github.com/gocardless/airflow-dbt), which wraps the BashOperator and dbt CLI. +It’s important to note that whichever approach you choose, this is just a first step; your actual production needs may have more requirements. If you need granularity and dependencies between your dbt models, like the team at [Updater does, you may need to deconstruct the entire dbt DAG in Airflow.](https://www.astronomer.io/guides/airflow-dbt#use-case-2-dbt-airflow-at-the-model-level) If you’re okay managing some extra dependencies, but want to maximize control over what abstractions you expose to your end users, you may want to use the [GoCardlessProvider](https://github.com/gocardless/airflow-dbt), which wraps the BashOperator and dbt Core CLI. #### Rerunning jobs from failure diff --git a/website/blog/2022-02-23-founding-an-AE-team-smartsheet.md b/website/blog/2022-02-23-founding-an-AE-team-smartsheet.md index 89fcb6f5890..954d6dca3b8 100644 --- a/website/blog/2022-02-23-founding-an-AE-team-smartsheet.md +++ b/website/blog/2022-02-23-founding-an-AE-team-smartsheet.md @@ -114,7 +114,7 @@ In the interest of getting a proof of concept out the door (I highly favor focus - Our own Dev, Prod & Publish databases - Our own code repository which we managed independently -- dbt CLI +- dbt Core CLI - Virtual Machine running dbt on a schedule None of us had used dbt before, but we’d heard amazing things about it. We hotly debated the choice between dbt and building our own lightweight stack, and looking back now, I couldn’t be happier with choosing dbt. While there was a learning curve that slowed us down initially, we’re now seeing the benefit of that decision. Onboarding new analysts is a breeze and much of the functionality we need is pre-built. The more we use the tool, the faster we are at using it and the more value we’re gaining from the product. diff --git a/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md b/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md index 91ad1080ce6..2ee774d4f1d 100644 --- a/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md +++ b/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md @@ -12,6 +12,10 @@ date: 2022-05-03 is_featured: true --- +:::info Different from dbt Cloud CLI +This blog explains how to use the `dbt-cloud-cli` Python library to create a data catalog app with dbt Cloud artifacts. This is different from the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), a tool that allows you to run dbt commands against your dbt Cloud development environment from your local command line. +::: + dbt Cloud is a hosted service that many organizations use for their dbt deployments. Among other things, it provides an interface for creating and managing deployment jobs. When triggered (e.g., cron schedule, API trigger), the jobs generate various artifacts that contain valuable metadata related to the dbt project and the run results. dbt Cloud provides a REST API for managing jobs, run artifacts and other dbt Cloud resources. 
Data/analytics engineers would often write custom scripts for issuing automated calls to the API using tools [cURL](https://curl.se/) or [Python Requests](https://requests.readthedocs.io/en/latest/). In some cases, the engineers would go on and copy/rewrite them between projects that need to interact with the API. Now, they have a bunch of scripts on their hands that they need to maintain and develop further if business requirements change. If only there was a dedicated tool for interacting with the dbt Cloud API that abstracts away the complexities of the API calls behind an easy-to-use interface… Oh wait, there is: [the dbt-cloud-cli](https://github.com/data-mie/dbt-cloud-cli)! diff --git a/website/blog/2022-07-26-pre-commit-dbt.md b/website/blog/2022-07-26-pre-commit-dbt.md index e0b41d82d0c..fc100897ff0 100644 --- a/website/blog/2022-07-26-pre-commit-dbt.md +++ b/website/blog/2022-07-26-pre-commit-dbt.md @@ -112,7 +112,7 @@ The last step of our flow is to make those pre-commit checks part of the day-to- Adding periodic pre-commit checks can be done in 2 different ways, through CI (Continuous Integration) actions, or as git hooks when running dbt locally -#### a) Adding pre-commit-dbt to the CI flow (works for dbt Cloud and dbt CLI users) +#### a) Adding pre-commit-dbt to the CI flow (works for dbt Cloud and dbt Core users) The example below will assume GitHub actions as the CI engine but similar behavior could be achieved in any other CI tool. @@ -237,9 +237,9 @@ With that information, I could now go back to dbt, document my model customers a We could set up rules that prevent any change to be merged if the GitHub action fails. Alternatively, this action step can be defined as merely informational. -#### b) Installing the pre-commit git hooks (for dbt CLI users) +#### b) Installing the pre-commit git hooks (for dbt Core users) -If we develop locally with the dbt CLI, we could also execute `pre-commit install` to install the git hooks. What it means then is that every time we want to commit code in git, the pre-commit hooks will run and will prevent us from committing if any step fails. +If we develop locally with the dbt Core CLI, we could also execute `pre-commit install` to install the git hooks. What it means then is that every time we want to commit code in git, the pre-commit hooks will run and will prevent us from committing if any step fails. If we want to commit code without performing all the steps of the pre-hook we could use the environment variable SKIP or the git flag `--no-verify` as described [in the documentation](https://pre-commit.com/#temporarily-disabling-hooks). (e.g. we might want to skip the auto `dbt docs generate` locally to prevent it from running at every commit and rely on running it manually from time to time) diff --git a/website/blog/2022-08-31-august-product-update.md b/website/blog/2022-08-31-august-product-update.md index cb4077f3a06..bd9d8ee0b28 100644 --- a/website/blog/2022-08-31-august-product-update.md +++ b/website/blog/2022-08-31-august-product-update.md @@ -22,7 +22,7 @@ You’ll hear more in [Tristan’s keynote](https://coalesce.getdbt.com/agenda/k ## **What's new** -- **dbt Core v1.3 beta:** Do you use Python for analytics? The first beta prerelease of dbt Core v1.3—including support for dbt models written in Python—is [ready to explore](https://docs.getdbt.com/guides/migration/versions/upgrading-to-v1.3)! Check it out, and read more about dbt supported Python models [in our docs](/docs/build/python-models). 
+- **dbt Core v1.3 beta:** Do you use Python for analytics? The first beta prerelease of dbt Core v1.3—including support for dbt models written in Python—is [ready to explore](https://docs.getdbt.com/docs/dbt-versions/core-upgrade/upgrading-to-v1.3)! Check it out, and read more about dbt supported Python models [in our docs](/docs/build/python-models). - **Technology Partner Program:** We just launched our new [Technology Partner Program](https://www.getdbt.com/blog/dbt-labs-technology-partner-program/) with 40+ friends in the Modern Data Stack to provide consistent support for seamless integrations joint-users can trust. Check our new [dbt Cloud integrations page](http://www.getdbt.com/product/integrations) for what’s available today! - **Single-tenant users:** dbt Cloud v1.1.60 is now available on dbt Cloud Enterprise. diff --git a/website/blog/2023-10-31-to-defer-or-to-clone.md b/website/blog/2023-10-31-to-defer-or-to-clone.md new file mode 100755 index 00000000000..a39fc3ac0b7 --- /dev/null +++ b/website/blog/2023-10-31-to-defer-or-to-clone.md @@ -0,0 +1,118 @@ +--- + +title: To defer or to clone, that is the question +description: "In dbt v1.6, we introduce support for zero-copy cloning via the new dbt clone command. In this blog post, Kshitij will cover what clone is, how it is different from deferral, and when to use each." +slug: to-defer-or-to-clone + +image: /img/blog/2023-10-31-to-defer-or-to-clone/preview.png + +authors: [kshitij_aranke, doug_beatty] + +tags: [analytics craft] +hide_table_of_contents: false + +date: 2023-10-31 +is_featured: true + +--- + +Hi all, I’m Kshitij, a senior software engineer on the Core team at dbt Labs. +One of the coolest moments of my career here thus far has been shipping the new `dbt clone` command as part of the dbt-core v1.6 release. + +However, one of the questions I’ve received most frequently is guidance around “when” to clone that goes beyond [the documentation on “how” to clone](https://docs.getdbt.com/reference/commands/clone). +In this blog post, I’ll attempt to provide this guidance by answering these FAQs: + +1. What is `dbt clone`? +2. How is it different from deferral? +3. Should I defer or should I clone? + +## What is `dbt clone`? + +`dbt clone` is a new command in dbt 1.6 that leverages native zero-copy clone functionality on supported warehouses to **copy entire schemas for free, almost instantly**. + +### How is this possible? + +Well, the warehouse “cheats” by only copying metadata from the `source` schema to the `target` schema; the underlying data remains at rest during this operation. +This metadata includes materialized objects like tables and views, which is why you see a **clone** of these objects in the target schema. + +In computer science jargon, `clone` makes a copy of the pointer from the `source` schema to the underlying data; after the operation there are now two pointers (`source` and `target` schemas) that each point to the same underlying data. + +## How is cloning different from deferral? + +On the surface, cloning and deferral seem similar – **they’re both ways to save costs in the data warehouse.** +They do this by bypassing expensive model re-computations – clone by [eagerly copying](https://en.wikipedia.org/wiki/Evaluation_strategy#Eager_evaluation) an entire schema into the target schema, and defer by [lazily referencing](https://en.wikipedia.org/wiki/Lazy_evaluation) pre-built models in the source schema. 
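As a minimal sketch of how each approach is typically invoked (assuming production run artifacts have been downloaded to a local `prod-run-artifacts` directory, a placeholder path for this example):

```shell
# defer: build only changed models, resolving refs to unchanged models
# against the production objects recorded in the state artifacts
dbt run --select state:modified+ --defer --state prod-run-artifacts

# clone: copy the selected objects from the schema recorded in the state
# artifacts into the target schema, using zero-copy clone where the warehouse supports it
dbt clone --state prod-run-artifacts
```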
+ +Let’s unpack this sentence and explore its first-order effects: + +| | defer | clone | +|---------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------| +| **How do I use it?** | Implicit via the `--defer` flag | Explicit via the `dbt clone` command | +| **What are its outputs?** | Doesn't create any objects itself, but dbt might create objects in the target schema if they’ve changed from those in the source schema. | Copies objects from source schema to target schema in the data warehouse, which are persisted after operation is finished. | +| **How does it work?** | Compares manifests between source and target dbt runs and overrides ref to resolve models not built in the target run to point to objects built in the source run. | Uses zero-copy cloning if available to copy objects from source to target schemas, else creates pointer views (`select * from my_model`) | + +These first-order effects lead to the following second-order effects that truly distinguish clone and defer from each other: + +| | defer | clone | +|--------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------| +| **Where can I use objects built in the target schema?** | Only within the context of dbt | Any downstream tool (e.g. BI) | +| **Can I safely modify objects built in the target schema?** | No, since this would modify production data | Yes, cloning is a cheap way to create a sandbox of production data for experimentation | +| **Will data in the target schema drift from data in the source schema?** | No, since deferral will always point to the latest version of the source schema | Yes, since clone is a point-in-time operation | +| **Can I use multiple source schemas at once?** | Yes, defer can dynamically switch between source schemas e.g. ref unchanged models from production and changed models from staging | No, clone copies objects from one source schema to one target schema | + +## Should I defer or should I clone? + +Putting together all the points above, here’s a handy cheat sheet for when to defer and when to clone: + +| | defer | clone | +|---------------------------------------------------------------------------|-------|-------| +| **Save time & cost by avoiding re-computation** | ✅ | ✅ | +| **Create database objects to be available in downstream tools (e.g. BI)** | ❌ | ✅ | +| **Safely modify objects in the target schema** | ❌ | ✅ | +| **Avoid creating new database objects** | ✅ | ❌ | +| **Avoid data drift** | ✅ | ❌ | +| **Support multiple dynamic sources** | ✅ | ❌ | + +To absolutely drive this point home: + +1. If you send someone this cheatsheet by linking to this page, you are deferring to this page +2. If you print out this page and write notes in the margins, you have cloned this page + +## Putting it in practice + +Using the cheat sheet above, let’s explore a few common scenarios and explore whether we should use defer or clone for each: + +1. **Testing staging datasets in BI** + + In this scenario, we want to: + 1. Make a copy of our production dataset available in our downstream BI tool + 2. 
To safely iterate on this copy without breaking production datasets + + Therefore, we should use **clone** in this scenario + +2. **[Slim CI](https://discourse.getdbt.com/t/how-we-sped-up-our-ci-runs-by-10x-using-slim-ci/2603)** + + In this scenario, we want to: + 1. Refer to production models wherever possible to speed up continuous integration (CI) runs + 2. Only run and test models in the CI staging environment that have changed from the production environment + 3. Reference models from different environments – prod for unchanged models, and staging for modified models + + Therefore, we should use **defer** in this scenario + +3. **[Blue/Green Deployments](https://discourse.getdbt.com/t/performing-a-blue-green-deploy-of-your-dbt-project-on-snowflake/1349)** + + In this scenario, we want to: + 1. Ensure that all tests are always passing on the production dataset, even if that dataset is slightly stale + 2. Atomically rollback a promotion to production if tests aren’t passing across the entire staging dataset + + In this scenario, we can use **clone** to implement a deployment strategy known as blue-green deployments where we build the entire staging dataset and then run tests against it, and only clone it over to production if all tests pass. + + +As a rule of thumb, deferral lends itself better to continuous integration (CI) use cases whereas cloning lends itself better to continuous deployment (CD) use cases. + +## Wrapping Up + +In this post, we covered what `dbt clone` is, how it is different from deferral, and when to use each. Often, they can be used together within the same project in different parts of the deployment lifecycle. + +Thanks for reading, and I look forward to seeing what you build with `dbt clone`. + +*Thanks to [Jason Ganz](https://docs.getdbt.com/author/jason_ganz) and [Gwen Windflower](https://www.linkedin.com/in/gwenwindflower/) for reviewing drafts of this article* diff --git a/website/blog/authors.yml b/website/blog/authors.yml index 2e554ffc814..31d69824ed4 100644 --- a/website/blog/authors.yml +++ b/website/blog/authors.yml @@ -306,6 +306,15 @@ kira_furuichi: name: Kira Furuichi organization: dbt Labs +kshitij_aranke: + image_url: /img/blog/authors/kshitij-aranke.jpg + job_title: Senior Software Engineer + links: + - icon: fa-linkedin + url: https://www.linkedin.com/in/aranke/ + name: Kshitij Aranke + organization: dbt Labs + lauren_benezra: image_url: /img/blog/authors/lauren-benezra.jpeg job_title: Analytics Engineer diff --git a/website/blog/ctas.yml b/website/blog/ctas.yml index 2e3170faae4..6b8c04e0ee3 100644 --- a/website/blog/ctas.yml +++ b/website/blog/ctas.yml @@ -14,4 +14,9 @@ header: Join data practitioners worldwide at Coalesce 2023 subheader: Kicking off on October 16th, both online and in-person (Sydney, London, and San Diego) button_text: Register now - url: https://coalesce.getdbt.com/?utm_medium=internal&utm_source=docs&utm_campaign=q3-2024_coalesce-2023_aw&utm_content=coalesce____&utm_term=all___ \ No newline at end of file + url: https://coalesce.getdbt.com/?utm_medium=internal&utm_source=docs&utm_campaign=q3-2024_coalesce-2023_aw&utm_content=coalesce____&utm_term=all___ +- name: coalesce_2023_catchup + header: Missed Coalesce 2023? + subheader: Watch Coalesce 2023 highlights and full sessions, dbt Labs' annual analytics engineering conference. 
+ button_text: Watch the talks + url: https://www.youtube.com/playlist?list=PL0QYlrC86xQnT3HLh-XgvoTf9F3lbsADf diff --git a/website/blog/metadata.yml b/website/blog/metadata.yml index a5afa86e667..032ab5a760c 100644 --- a/website/blog/metadata.yml +++ b/website/blog/metadata.yml @@ -2,7 +2,7 @@ featured_image: "" # This CTA lives in right sidebar on blog index -featured_cta: "coalesce_2023_signup" +featured_cta: "coalesce_2023_catchup" # Show or hide hero title, description, cta from blog index show_title: true diff --git a/website/dbt-versions.js b/website/dbt-versions.js index 3eff99e7f98..be55c893041 100644 --- a/website/dbt-versions.js +++ b/website/dbt-versions.js @@ -1,8 +1,7 @@ exports.versions = [ { version: "1.7", - EOLDate: "2024-07-31", - isPrerelease: "true" + EOLDate: "2024-10-30", }, { version: "1.6", @@ -27,6 +26,10 @@ exports.versions = [ ] exports.versionedPages = [ + { + "page": "reference/resource-configs/store_failures_as", + "firstVersion": "1.7", + }, { "page": "docs/build/build-metrics-intro", "firstVersion": "1.6", @@ -170,6 +173,10 @@ exports.versionedPages = [ { "page": "reference/resource-configs/grants", "firstVersion": "1.2", + }, + { + "page": "docs/build/saved-queries", + "firstVersion": "1.7", } ] diff --git a/website/docs/community/resources/oss-expectations.md b/website/docs/community/resources/oss-expectations.md index 649a9dea94f..9c916de1240 100644 --- a/website/docs/community/resources/oss-expectations.md +++ b/website/docs/community/resources/oss-expectations.md @@ -4,7 +4,7 @@ title: "Expectations for OSS contributors" Whether it's a dbt package, a plugin, `dbt-core`, or this very documentation site, contributing to the open source code that supports the dbt ecosystem is a great way to level yourself up as a developer, and to give back to the community. The goal of this page is to help you understand what to expect when contributing to dbt open source software (OSS). While we can only speak for our own experience as open source maintainers, many of these guidelines apply when contributing to other open source projects, too. -Have you seen things in other OSS projects that you quite like, and think we could learn from? [Open a discussion on the Developer Hub](https://github.com/dbt-labs/docs.getdbt.com/discussions/new), or start a conversation in the dbt Community Slack (for example: `#community-strategy`, `#dbt-core-development`, `#package-ecosystem`, `#adapter-ecosystem`). We always appreciate hearing from you! +Have you seen things in other OSS projects that you quite like, and think we could learn from? [Open a discussion on the dbt Community Forum](https://discourse.getdbt.com), or start a conversation in the dbt Community Slack (for example: `#community-strategy`, `#dbt-core-development`, `#package-ecosystem`, `#adapter-ecosystem`). We always appreciate hearing from you! ## Principles @@ -51,7 +51,7 @@ An issue could be a bug you’ve identified while using the product or reading t ### Best practices for issues -- Issues are **not** for support / troubleshooting / debugging help. Please [open a discussion on the Developer Hub](https://github.com/dbt-labs/docs.getdbt.com/discussions/new), so other future users can find and read proposed solutions. If you need help formulating your question, you can post in the `#advice-dbt-help` channel in the [dbt Community Slack](https://www.getdbt.com/community/). +- Issues are **not** for support / troubleshooting / debugging help. 
Please [open a discussion on the dbt Community Forum](https://discourse.getdbt.com), so other future users can find and read proposed solutions. If you need help formulating your question, you can post in the `#advice-dbt-help` channel in the [dbt Community Slack](https://www.getdbt.com/community/). - Always search existing issues first, to see if someone else had the same idea / found the same bug you did. - Many repositories offer templates for creating issues, such as when reporting a bug or requesting a new feature. If available, please select the relevant template and fill it out to the best of your ability. This will help other people understand your issue and respond. diff --git a/website/docs/community/resources/viewpoint.md b/website/docs/community/resources/viewpoint.md index e159c6178a3..5c3f80555c5 100644 --- a/website/docs/community/resources/viewpoint.md +++ b/website/docs/community/resources/viewpoint.md @@ -7,7 +7,7 @@ id: "viewpoint" In 2015-2016, a team of folks at RJMetrics had the opportunity to observe, and participate in, a significant evolution of the analytics ecosystem. The seeds of dbt were conceived in this environment, and the viewpoint below was written to reflect what we had learned and how we believed the world should be different. **dbt is our attempt to address the workflow challenges we observed, and as such, this viewpoint is the most foundational statement of the dbt project's goals.** -The remainder of this document is largely unedited from [the original post](https://blog.getdbt.com/building-a-mature-analytics-workflow/). +The remainder of this document is largely unedited from [the original post](https://getdbt.com/blog/building-a-mature-analytics-workflow). ::: diff --git a/website/docs/dbt-cli/cli-overview.md b/website/docs/dbt-cli/cli-overview.md index 3b96d4637bd..3e44bab801b 100644 --- a/website/docs/dbt-cli/cli-overview.md +++ b/website/docs/dbt-cli/cli-overview.md @@ -3,7 +3,7 @@ title: "CLI overview" description: "Run your dbt project from the command line." --- -dbt Core ships with a command-line interface (CLI) for running your dbt project. The dbt CLI is free to use and available as an [open source project](https://github.com/dbt-labs/dbt-core). +dbt Core ships with a command-line interface (CLI) for running your dbt project. dbt Core and its CLI are free to use and available as an [open source project](https://github.com/dbt-labs/dbt-core). When using the command line, you can run commands and do other work from the current or _working directory_ on your computer. Before running the dbt project from the command line, make sure the working directory is your dbt project directory. For more details, see "[Creating a dbt project](/docs/build/projects)." diff --git a/website/docs/docs/about-setup.md b/website/docs/docs/about-setup.md index 3fb868b8448..ceb34a5ccbb 100644 --- a/website/docs/docs/about-setup.md +++ b/website/docs/docs/about-setup.md @@ -3,11 +3,13 @@ title: About dbt setup id: about-setup description: "About setup of dbt Core and Cloud" sidebar_label: "About dbt setup" +pagination_next: "docs/environments-in-dbt" +pagination_prev: null --- dbt compiles and runs your analytics code against your data platform, enabling you and your team to collaborate on a single source of truth for metrics, insights, and business definitions. There are two options for deploying dbt: -**dbt Cloud** runs dbt Core in a hosted (single or multi-tenant) environment with a browser-based interface. 
The intuitive UI will aid you in setting up the various components. dbt Cloud comes equipped with turnkey support for scheduling jobs, CI/CD, hosting documentation, monitoring & alerting, and an integrated developer environment (IDE). +**dbt Cloud** runs dbt Core in a hosted (single or multi-tenant) environment with a browser-based interface. The intuitive user interface aids you in setting up the various components. dbt Cloud comes equipped with turnkey support for scheduling jobs, CI/CD, hosting documentation, monitoring, and alerting. It also offers an integrated development environment (IDE) and allows you to develop and run dbt commands from your local command line (CLI) or code editor. **dbt Core** is an open-source command line tool that can be installed locally in your environment, and communication with databases is facilitated through adapters. @@ -19,7 +21,7 @@ To begin configuring dbt now, select the option that is right for you. diff --git a/website/docs/docs/build/about-metricflow.md b/website/docs/docs/build/about-metricflow.md index 68879911597..d76715c46a1 100644 --- a/website/docs/docs/build/about-metricflow.md +++ b/website/docs/docs/build/about-metricflow.md @@ -4,38 +4,38 @@ id: about-metricflow description: "Learn more about MetricFlow and its key concepts" sidebar_label: About MetricFlow tags: [Metrics, Semantic Layer] +pagination_next: "docs/build/join-logic" +pagination_prev: null --- -This guide introduces MetricFlow's fundamental ideas for new users. MetricFlow, which powers the dbt Semantic Layer, helps you define and manage the logic for your company's metrics. It's an opinionated set of abstractions and helps data consumers retrieve metric datasets from a data platform quickly and efficiently. +This guide introduces MetricFlow's fundamental ideas for people new to this feature. MetricFlow, which powers the dbt Semantic Layer, helps you define and manage the logic for your company's metrics. It's an opinionated set of abstractions and helps data consumers retrieve metric datasets from a data platform quickly and efficiently. -:::info +MetricFlow handles SQL query construction and defines the specification for dbt semantic models and metrics. It allows you to define metrics in your dbt project and query them with [MetricFlow commands](/docs/build/metricflow-commands) whether in dbt Cloud or dbt Core. -MetricFlow is a new way to define metrics and one of the key components of the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl). It handles SQL query construction and defines the specification for dbt semantic models and metrics. +Before you start, consider the following guidelines: -MetricFlow is currently available on dbt v1.6 or higher for all users. dbt Core users can use the MetricFlow CLI to define metrics in their local dbt Core project. However, to experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. - -::: - -There are a few key principles: - -- **Flexible, but complete** — Ability to create any metric on any data model by defining logic in flexible abstractions. -- **Don't Repeat Yourself (DRY)** — Avoid repetition by allowing metric definitions to be enabled whenever possible. -- **Simple with progressive complexity** — Make MetricFlow approachable by relying on known concepts and structures in data modeling. 
-- **Performant and efficient** — Allow for performance optimizations in centralized data engineering while still enabling distributed definition and ownership of logic. +- Define metrics in YAML and query them using these [new metric specifications](https://github.com/dbt-labs/dbt-core/discussions/7456). +- You must be on [dbt version](/docs/dbt-versions/upgrade-core-in-cloud) 1.6 or higher to use MetricFlow. +- Use MetricFlow with Snowflake, BigQuery, Databricks, Postgres (dbt Core only), or Redshift. +- Discover insights and query your metrics using the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and its diverse range of [available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations). You must have a dbt Cloud account on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). ## MetricFlow -- MetricFlow is a SQL query generation engine that helps you create metrics by constructing appropriate queries for different granularities and dimensions that are useful for various business applications. +MetricFlow is a SQL query generation tool designed to streamline metric creation across different data dimensions for diverse business needs. +- It operates through YAML files, where a semantic graph links language to data. This graph comprises [semantic models](/docs/build/semantic-models) (data entry points) and [metrics](/docs/build/metrics-overview) (functions for creating quantitative indicators). +- MetricFlow is a [BSL package](https://github.com/dbt-labs/metricflow) with code source available, and compatible with dbt version 1.6 and higher. Data practitioners and enthusiasts are highly encouraged to contribute. +- As a part of the dbt Semantic Layer, MetricFlow empowers organizations to define metrics using YAML abstractions. +- To query metric dimensions, dimension values, and validate configurations, use [MetricFlow commands](/docs/build/metricflow-commands). -- It uses YAML files to define a semantic graph, which maps language to data. This graph consists of [semantic models](/docs/build/semantic-models), which serve as data entry points, and [metrics](/docs/build/metrics-overview), which are functions used to create new quantitative indicators. -- MetricFlow is a [BSL package](https://github.com/dbt-labs/metricflow) (code is source available) and available on dbt versions 1.6 and higher. Data practitioners and enthusiasts are highly encouraged to contribute. +**Note** — MetricFlow doesn't support dbt [builtin functions or packages](/reference/dbt-jinja-functions/builtins) at this time, however, support is planned for the future. -- MetricFlow, as a part of the dbt Semantic Layer, allows organizations to define company metrics logic through YAML abstractions, as described in the following sections. +MetricFlow abides by these principles: -- You can install MetricFlow using PyPI as an extension of your [dbt adapter](/docs/supported-data-platforms) in the CLI. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"`. - -- To query metrics dimensions, dimension values, and validate your configurations; install the [MetricFlow CLI](/docs/build/metricflow-cli). +- **Flexibility with completeness**: Define metric logic using flexible abstractions on any data model. +- **DRY (Don't Repeat Yourself)**: Minimize redundancy by enabling metric definitions whenever possible. 
+- **Simplicity with gradual complexity:** Approach MetricFlow using familiar data modeling concepts. +- **Performance and efficiency**: Optimize performance while supporting centralized data engineering and distributed logic ownership. ### Semantic graph @@ -55,6 +55,8 @@ For a semantic model, there are three main pieces of metadata: * [Dimensions](/docs/build/dimensions) — These are the ways you want to group or slice/dice your metrics. * [Measures](/docs/build/measures) — The aggregation functions that give you a numeric result and can be used to create your metrics. + + ### Metrics Metrics, which is a key concept, are functions that combine measures, constraints, or other mathematical functions to define new quantitative indicators. MetricFlow uses measures and various aggregation types, such as average, sum, and count distinct, to create metrics. Dimensions add context to metrics and without them, a metric is simply a number for all time. You can define metrics in the same YAML files as your semantic models, or create a new file. @@ -112,8 +114,6 @@ group by 1, 2 -> Introducing MetricFlow, a key component of the dbt Semantic Layer 🤩 - simplifying data collaboration and governance. - In the following three example tabs, use MetricFlow to define a semantic model that uses order_total as a metric and a sample schema to create consistent and accurate results — eliminating confusion, code duplication, and streamlining your workflow. diff --git a/website/docs/docs/build/analyses.md b/website/docs/docs/build/analyses.md index cd74c2e052d..af6fb0320f0 100644 --- a/website/docs/docs/build/analyses.md +++ b/website/docs/docs/build/analyses.md @@ -2,11 +2,12 @@ title: "Analyses" description: "Read this tutorial to learn how to use custom analyses when building in dbt." id: "analyses" +pagination_next: null --- ## Overview -dbt's notion of `models` makes it easy for data teams to version control and collaborate on data transformations. Sometimes though, a certain sql statement doesn't quite fit into the mold of a dbt model. These more "analytical" sql files can be versioned inside of your dbt project using the `analysis` functionality of dbt. +dbt's notion of `models` makes it easy for data teams to version control and collaborate on data transformations. Sometimes though, a certain SQL statement doesn't quite fit into the mold of a dbt model. These more "analytical" SQL files can be versioned inside of your dbt project using the `analysis` functionality of dbt. Any `.sql` files found in the `analyses/` directory of a dbt project will be compiled, but not executed. This means that analysts can use dbt functionality like `{{ ref(...) }}` to select from models in an environment-agnostic way. diff --git a/website/docs/docs/build/build-metrics-intro.md b/website/docs/docs/build/build-metrics-intro.md index a6fab61d576..cdac51224ed 100644 --- a/website/docs/docs/build/build-metrics-intro.md +++ b/website/docs/docs/build/build-metrics-intro.md @@ -5,27 +5,28 @@ description: "Learn about MetricFlow and build your metrics with semantic models sidebar_label: Build your metrics tags: [Metrics, Semantic Layer, Governance] hide_table_of_contents: true +pagination_next: "docs/build/sl-getting-started" +pagination_prev: null --- -Use MetricFlow in dbt to centrally define your metrics. As a key component of the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), MetricFlow is responsible for SQL query construction and defining specifications for dbt semantic models and metrics. 
+Use MetricFlow in dbt to centrally define your metrics. As a key component of the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), MetricFlow is responsible for SQL query construction and defining specifications for dbt semantic models and metrics. It uses familiar constructs like semantic models and metrics to avoid duplicative coding, optimize your development workflow, ensure data governance for company metrics, and guarantee consistency for data consumers. -Use familiar constructs like semantic models and metrics to avoid duplicative coding, optimize your development workflow, ensure data governance for company metrics, and guarantee consistency for data consumers. -:::info -MetricFlow is currently available on dbt v1.6 or higher and allows users to define metrics in their dbt project whether in dbt Cloud or dbt Core. dbt Core users can use the MetricFlow CLI to define metrics in their local dbt Core project. However, to experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. - -::: - -Before you start, consider the following guidelines: - -- Define metrics in YAML and query them using these [new metric specifications](https://github.com/dbt-labs/dbt-core/discussions/7456). -- You must be on dbt v1.6 or higher to use MetricFlow. [Upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to get started. -- Use MetricFlow with Snowflake, BigQuery, Databricks, Postgres (CLI only), or Redshift. (dbt Cloud Postgres support coming soon) -- Unlock insights and query your metrics using the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and its diverse range of [available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations). +MetricFlow allows you to: +- Intuitively define metrics in your dbt project +- Develop from your preferred environment, whether that's the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation) +- Use [MetricFlow commands](/docs/build/metricflow-commands) to query and test those metrics in your development environment +- Harness the true magic of the universal dbt Semantic Layer and dynamically query these metrics in downstream tools (Available for dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) accounts only).
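As a rough illustration of the querying step, assuming a metric named `order_total` grouped by the `metric_time` dimension (both placeholders here):

```shell
# dbt Cloud CLI
dbt sl query --metrics order_total --group-by metric_time

# dbt Core (MetricFlow)
mf query --metrics order_total --group-by metric_time
```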
+ + - - +Or in a `schema.yml` file. + + +```yaml +models: + - name: ga_sessions + config: + alias: sessions +``` + + + When referencing the `ga_sessions` model above from a different model, use the `ref()` function with the model's _filename_ as usual. For example: @@ -114,13 +127,11 @@ The default implementation of `generate_alias_name` simply uses the supplied `al - -### Managing different behaviors across packages +### Dispatch macro - SQL alias management for databases and dbt packages -See docs on macro `dispatch`: ["Managing different global overrides across packages"](/reference/dbt-jinja-functions/dispatch) +See docs on macro `dispatch`: ["Managing different global overrides across packages"](/reference/dbt-jinja-functions/dispatch#managing-different-global-overrides-across-packages) - ### Caveats @@ -128,20 +139,23 @@ See docs on macro `dispatch`: ["Managing different global overrides across packa Using aliases, it's possible to accidentally create models with ambiguous identifiers. Given the following two models, dbt would attempt to create two views with _exactly_ the same names in the database (ie. `sessions`): -```sql --- models/snowplow_sessions.sql + +```sql {{ config(alias='sessions') }} select * from ... ``` + -```sql --- models/sessions.sql + +```sql select * from ... ``` + + Whichever one of these models runs second would "win", and generally, the output of dbt would not be what you would expect. To avoid this failure mode, dbt will check if your model names and aliases are ambiguous in nature. If they are, you will be presented with an error message like this: ``` diff --git a/website/docs/docs/build/custom-schemas.md b/website/docs/docs/build/custom-schemas.md index ad9fe997483..b20d4130725 100644 --- a/website/docs/docs/build/custom-schemas.md +++ b/website/docs/docs/build/custom-schemas.md @@ -1,6 +1,7 @@ --- title: "Custom schemas" id: "custom-schemas" +pagination_next: "docs/build/custom-databases" --- By default, all dbt models are built in the schema specified in your target. In dbt projects with lots of models, it may be useful to instead build some models in schemas other than your target schema – this can help logically group models together. diff --git a/website/docs/docs/build/custom-target-names.md b/website/docs/docs/build/custom-target-names.md index 4e14f36b784..ac7036de572 100644 --- a/website/docs/docs/build/custom-target-names.md +++ b/website/docs/docs/build/custom-target-names.md @@ -2,7 +2,7 @@ title: "Custom target names" id: "custom-target-names" description: "You can define a custom target name for any dbt Cloud job to correspond to settings in your dbt project." - +pagination_next: null --- ## dbt Cloud Scheduler diff --git a/website/docs/docs/build/derived-metrics.md b/website/docs/docs/build/derived-metrics.md index 2ad1c3e368c..fc7961bbe7f 100644 --- a/website/docs/docs/build/derived-metrics.md +++ b/website/docs/docs/build/derived-metrics.md @@ -124,7 +124,7 @@ You can query any granularity and offset window combination. The following examp alias: bookings_7_days_ago ``` -When you run the query `mf query --metrics d7_booking_change --group-by metric_time__month` for the metric, here's how it's calculated: +When you run the query `dbt sl query --metrics d7_booking_change --group-by metric_time__month` for the metric, here's how it's calculated. For dbt Core, you can use the `mf query` prefix. 1. We retrieve the raw, unaggregated dataset with the specified measures and dimensions at the smallest level of detail, which is currently 'day'. 
2. Then, we perform an offset join on the daily dataset, followed by performing a date trunc and aggregation to the requested granularity. diff --git a/website/docs/docs/build/dimensions.md b/website/docs/docs/build/dimensions.md index 49ae9045021..b8679fe11b0 100644 --- a/website/docs/docs/build/dimensions.md +++ b/website/docs/docs/build/dimensions.md @@ -81,8 +81,10 @@ semantic_model: Dimensions have 2 types. This section further explains the definitions and provides examples. -1. [Categorical](#categorical) -1. [Time](#time) +- [Dimensions types](#dimensions-types) + - [Categorical](#categorical) + - [Time](#time) + - [SCD Type II](#scd-type-ii) ### Categorical @@ -102,15 +104,20 @@ dimensions: To use BigQuery as your data platform, time dimensions columns need to be in the datetime data type. If they are stored in another type, you can cast them to datetime using the `expr` property. Time dimensions are used to group metrics by different levels of time, such as day, week, month, quarter, and year. MetricFlow supports these granularities, which can be specified using the `time_granularity` parameter. ::: -Time has additional parameters specified under the `type_params` section. When you query one or more metrics in MetricFlow using the CLI, the default time dimension for a single metric is the primary time dimension, which you can refer to as `metric_time` or use the dimensions' name. +Time has additional parameters specified under the `type_params` section. When you query one or more metrics in MetricFlow using the CLI, the default time dimension for a single metric is the aggregation time dimension, which you can refer to as `metric_time` or use the dimensions' name. You can use multiple time groups in separate metrics. For example, the `users_created` metric uses `created_at`, and the `users_deleted` metric uses `deleted_at`: ```bash -mf query --metrics users_created,users_deleted --dimensions metric_time --order metric_time +# dbt Cloud users +dbt sl query --metrics users_created,users_deleted --dimensions metric_time --order metric_time + +# dbt Core users +mf query --metrics users_created,users_deleted --dimensions metric_time --order metric_time ``` + You can set `is_partition` for time or categorical dimensions to define specific time spans. Additionally, use the `type_params` section to set `time_granularity` to adjust aggregation detail (like daily, weekly, and so on): @@ -121,9 +128,13 @@ Use `is_partition: True` to show that a dimension exists over a specific time wi You can also use `is_partition` for [categorical](#categorical) dimensions as well. -MetricFlow enables metric aggregation during query time. For example, you can aggregate the `messages_per_month` measure. If you originally had a `time_granularity` for the time dimensions `metric_time`, you can specify a yearly granularity for aggregation in your CLI query: +MetricFlow enables metric aggregation during query time. For example, you can aggregate the `messages_per_month` measure. 
If you originally had a `time_granularity` for the time dimensions `metric_time`, you can specify a yearly granularity for aggregation in your query: ```bash +# dbt Cloud users +dbt sl query --metrics messages_per_month --dimensions metric_time --order metric_time --time-granularity year + +# dbt Core users mf query --metrics messages_per_month --dimensions metric_time --order metric_time --time-granularity year ``` @@ -344,7 +355,11 @@ In the sales tier example, if sales_person_id 456 is Tier 2 from 2022-03-08 onwa The following command or code represents how to return the count of transactions generated by each sales tier per month: -``` +```bash +# dbt Cloud users +dbt sl query --metrics transactions --dimensions metric_time__month,sales_person__tier --order metric_time__month --order sales_person__tier + +# dbt Core users mf query --metrics transactions --dimensions metric_time__month,sales_person__tier --order metric_time__month --order sales_person__tier ``` diff --git a/website/docs/docs/build/enhance-your-code.md b/website/docs/docs/build/enhance-your-code.md new file mode 100644 index 00000000000..5f2d48f6f5a --- /dev/null +++ b/website/docs/docs/build/enhance-your-code.md @@ -0,0 +1,38 @@ +--- +title: "Enhance your code" +description: "Learn how you can enhance your code" +pagination_next: "docs/build/project-variables" +pagination_prev: null +--- + +
+ + + + + +
+
+
+ + + + + +
\ No newline at end of file diff --git a/website/docs/docs/build/enhance-your-models.md b/website/docs/docs/build/enhance-your-models.md new file mode 100644 index 00000000000..46e7fa74353 --- /dev/null +++ b/website/docs/docs/build/enhance-your-models.md @@ -0,0 +1,23 @@ +--- +title: "Enhance your models" +description: "Learn how you can enhance your models" +pagination_next: "docs/build/materializations" +pagination_prev: null +--- + +
+ + + + + +
+
\ No newline at end of file diff --git a/website/docs/docs/build/groups.md b/website/docs/docs/build/groups.md index 7ac5337ba0d..d4fda045277 100644 --- a/website/docs/docs/build/groups.md +++ b/website/docs/docs/build/groups.md @@ -19,7 +19,7 @@ This functionality is new in v1.5. ## About groups -A group is a collection of nodes within a dbt DAG. Groups are named, and every group has an `owner`. They enable intentional collaboration within and across teams by restricting [access to private](/reference/resource-properties/access) models. +A group is a collection of nodes within a dbt DAG. Groups are named, and every group has an `owner`. They enable intentional collaboration within and across teams by restricting [access to private](/reference/resource-configs/access) models. Group members may include models, tests, seeds, snapshots, analyses, and metrics. (Not included: sources and exposures.) Each node may belong to only one group. @@ -94,7 +94,7 @@ select ... ### Referencing a model in a group -By default, all models within a group have the `protected` [access modifier](/reference/resource-properties/access). This means they can be referenced by downstream resources in _any_ group in the same project, using the [`ref`](/reference/dbt-jinja-functions/ref) function. If a grouped model's `access` property is set to `private`, only resources within its group can reference it. +By default, all models within a group have the `protected` [access modifier](/reference/resource-configs/access). This means they can be referenced by downstream resources in _any_ group in the same project, using the [`ref`](/reference/dbt-jinja-functions/ref) function. If a grouped model's `access` property is set to `private`, only resources within its group can reference it. diff --git a/website/docs/docs/build/incremental-models.md b/website/docs/docs/build/incremental-models.md index 07a571cd4db..3a597499f04 100644 --- a/website/docs/docs/build/incremental-models.md +++ b/website/docs/docs/build/incremental-models.md @@ -390,7 +390,7 @@ models: # `DBT_INTERNAL_DEST` and `DBT_INTERNAL_SOURCE` are the standard aliases for the target table and temporary table, respectively, during an incremental run using the merge strategy. ``` -Alternatively, here are the same same configurations configured within a model file: +Alternatively, here are the same configurations configured within a model file: ```sql -- in models/my_incremental_model.sql diff --git a/website/docs/docs/build/jinja-macros.md b/website/docs/docs/build/jinja-macros.md index 44bc85872f5..c5fd6b2e111 100644 --- a/website/docs/docs/build/jinja-macros.md +++ b/website/docs/docs/build/jinja-macros.md @@ -76,7 +76,7 @@ You can recognize Jinja based on the delimiters the language uses, which we refe When used in a dbt model, your Jinja needs to compile to a valid query. To check what SQL your Jinja compiles to: * **Using dbt Cloud:** Click the compile button to see the compiled SQL in the Compiled SQL pane -* **Using the dbt CLI:** Run `dbt compile` from the command line. Then open the compiled SQL file in the `target/compiled/{project name}/` directory. Use a split screen in your code editor to keep both files open at once. +* **Using dbt Core:** Run `dbt compile` from the command line. Then open the compiled SQL file in the `target/compiled/{project name}/` directory. Use a split screen in your code editor to keep both files open at once. 
### Macros [Macros](/docs/build/jinja-macros) in Jinja are pieces of code that can be reused multiple times – they are analogous to "functions" in other programming languages, and are extremely useful if you find yourself repeating code across multiple models. Macros are defined in `.sql` files, typically in your `macros` directory ([docs](/reference/project-configs/macro-paths)). diff --git a/website/docs/docs/build/materializations.md b/website/docs/docs/build/materializations.md index 463651ccc77..8846f4bb0c5 100644 --- a/website/docs/docs/build/materializations.md +++ b/website/docs/docs/build/materializations.md @@ -2,6 +2,7 @@ title: "Materializations" description: "Read this tutorial to learn how to use materializations when building in dbt." id: "materializations" +pagination_next: "docs/build/incremental-models" --- ## Overview @@ -68,8 +69,8 @@ When using the `view` materialization, your model is rebuilt as a view on each r * **Pros:** No additional data is stored, views on top of source data will always have the latest records in them. * **Cons:** Views that perform a significant transformation, or are stacked on top of other views, are slow to query. * **Advice:** - * Generally start with views for your models, and only change to another materialization when you're noticing performance problems. - * Views are best suited for models that do not do significant transformation, e.g. renaming, recasting columns. + * Generally start with views for your models, and only change to another materialization when you notice performance problems. + * Views are best suited for models that do not do significant transformation, e.g. renaming, or recasting columns. ### Table When using the `table` materialization, your model is rebuilt as a on each run, via a `create table as` statement. @@ -134,14 +135,15 @@ Materialized views are a combination of a view and a table, and serve use cases * Materialized views operate much like incremental materializations, however they are usually able to be refreshed without manual interference on a regular cadence (depending on the database), forgoing the regular dbt batch refresh required with incremental materializations - * `dbt run` on materialized views correspond to a code deployment, just like views + * `dbt run` on materialized views corresponds to a code deployment, just like views * **Cons:** * Due to the fact that materialized views are more complex database objects, database platforms tend to have less configuration options available, see your database platform's docs for more details * Materialized views may not be supported by every database platform * **Advice:** - * Consider materialized views for use cases where incremental models are sufficient, -but you would like the data platform to manage the incremental logic and refresh. + * Consider materialized views for use cases where incremental models are sufficient, but you would like the data platform to manage the incremental logic and refresh. + +**Note:** `dbt-snowflake` _does not_ support materialized views, it uses Dynamic Tables instead. For details, refer to [Snowflake specific configurations](/reference/resource-configs/snowflake-configs#dynamic-tables). 
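+As a rough sketch, opting a model into this behavior is a one-line config change. The example below assumes your adapter supports the `materialized_view` materialization (on Snowflake, as noted above, you would configure a dynamic table instead), and the model and column names are illustrative:
+
+```sql
+-- models/recent_orders.sql
+{{ config(materialized='materialized_view') }}
+
+select
+    order_id,
+    customer_id,
+    order_total
+from {{ ref('stg_orders') }}
+```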
## Python materializations diff --git a/website/docs/docs/build/measures.md b/website/docs/docs/build/measures.md index ba82a4aa4a5..e06b5046976 100644 --- a/website/docs/docs/build/measures.md +++ b/website/docs/docs/build/measures.md @@ -234,6 +234,15 @@ metrics: We can query the semi-additive metrics using the following syntax: +For dbt Cloud: + +```bash +dbt sl query --metrics mrr_by_end_of_month --dimensions metric_time__month --order metric_time__month +dbt sl query --metrics mrr_by_end_of_month --dimensions metric_time__week --order metric_time__week +``` + +For dbt Core: + ```bash mf query --metrics mrr_by_end_of_month --dimensions metric_time__month --order metric_time__month mf query --metrics mrr_by_end_of_month --dimensions metric_time__week --order metric_time__week diff --git a/website/docs/docs/build/metricflow-cli.md b/website/docs/docs/build/metricflow-commands.md similarity index 61% rename from website/docs/docs/build/metricflow-cli.md rename to website/docs/docs/build/metricflow-commands.md index 2650b2215ae..2386dab4ba2 100644 --- a/website/docs/docs/build/metricflow-cli.md +++ b/website/docs/docs/build/metricflow-commands.md @@ -1,69 +1,145 @@ --- -title: MetricFlow CLI -id: metricflow-cli -description: "Query metrics and metadata in your dbt project with the metricflow cli" -sidebar_label: "MetricFlow CLI commands" +title: MetricFlow commands +id: metricflow-commands +description: "Query metrics and metadata in your dbt project with the MetricFlow commands." +sidebar_label: "MetricFlow commands" tags: [Metrics, Semantic Layer] --- -Once you define metrics in your dbt project, you can query metrics, dimensions, dimension values, and validate your configs using the MetricFlow command line (CLI). +Once you define metrics in your dbt project, you can query metrics, dimensions, and dimension values, and validate your configs using the MetricFlow commands. -# Installation +MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. -You can install the [MetricFlow CLI](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install the MetricFlow CLI on Windows or Linux operating systems: +MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11. -1. Create or activate your virtual environment.`python -m venv venv` + +## MetricFlow + +MetricFlow is a dbt package that allows you to define and query metrics in your dbt project. You can use MetricFlow to query metrics in your dbt project in the dbt Cloud CLI, dbt Cloud IDE, or dbt Core. + +**Note** — MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. + + + + + +MetricFlow commands are embedded in the dbt Cloud CLI, which means you can immediately run them once you install the dbt Cloud CLI. 
+ +A benefit to using the dbt Cloud is that you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. + + + + + +:::info +You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon. +::: + +A benefit to using the dbt Cloud is that you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. + + + + + + +:::info Use dbt Cloud CLI for semantic layer development + +Use the dbt Cloud CLI for the experience in defining and querying metrics in your dbt project on dbt Cloud or dbt Core with MetricFlow. + +A benefit to using the dbt Cloud is that you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. +::: + + +You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install MetricFlow on Windows or Linux operating systems: + +1. Create or activate your virtual environment `python -m venv venv` 2. Run `pip install dbt-metricflow` + * You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"` + +**Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow. + + + + - * You can install MetricFlow using PyPI as an extension of your dbt adapter in the CLI. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"` +Something to note, MetricFlow `mf` commands return an error if you have a Metafont latex package installed. To run `mf` commands, uninstall the package. -The MetricFlow CLI is compatible with Python versions 3.8, 3.9, 3.10 and 3.11 +## MetricFlow commands -# CLI commands +MetricFlow provides the following commands to retrieve metadata and query metrics. -The MetricFlow CLI provides the following commands to retrieve metadata and query metrics. + + -To execute the commands, use the `mf` prefix before the command name. For example, to list all metrics, run `mf list metrics`: +Use the `dbt sl` prefix before the command name to execute them in dbt Cloud. For example, to list all metrics, run `dbt sl list metrics`. - [`list`](#list) — Retrieves metadata values. - [`list metrics`](#list-metrics) — Lists metrics with dimensions. - [`list dimensions`](#list) — Lists unique dimensions for metrics. - [`list dimension-values`](#list-dimension-values) — List dimensions with metrics. - [`list entities`](#list-entities) — Lists all unique entities. +- [`query`](#query) — Query metrics and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started. + + + + -## List + + +Use the `mf` prefix before the command name to execute them in dbt Core. For example, to list all metrics, run `mf list metrics`. + +- [`list`](#list) — Retrieves metadata values. +- [`list metrics`](#list-metrics) — Lists metrics with dimensions. +- [`list dimensions`](#list) — Lists unique dimensions for metrics. +- [`list dimension-values`](#list-dimension-values) — List dimensions with metrics. 
+- [`list entities`](#list-entities) — Lists all unique entities. +- [`validate-configs`](#validate-configs) — Validates semantic model configurations. +- [`health-checks`](#health-checks) — Performs data platform health check. +- [`tutorial`](#tutorial) — Dedicated MetricFlow tutorial to help get you started. +- [`query`](#query) — Query metrics and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started. + + + + +### List This command retrieves metadata values related to [Metrics](/docs/build/metrics-overview), [Dimensions](/docs/build/dimensions), and [Entities](/docs/build/entities) values. -## List metrics +### List metrics ```bash -mf list - +dbt sl list # In dbt Cloud +mf list # In dbt Core +``` This command lists the metrics with their available dimensions: ```bash -mf list metrics +dbt sl list metrics # In dbt Cloud + +mf list metrics # In dbt Core + Options: --search TEXT Filter available metrics by this search term --show-all-dimensions Show all dimensions associated with a metric. --help Show this message and exit. ``` -## List dimensions +### List dimensions This command lists all unique dimensions for a metric or multiple metrics. It displays only common dimensions when querying multiple metrics: ```bash -mf list dimensions --metrics +dbt sl list dimensions --metrics # In dbt Cloud + +mf list dimensions --metrics # In dbt Core + Options: - --metrics SEQUENCE List dimensions by given metrics (intersection). Ex. - --metrics bookings,messages + --metrics SEQUENCE List dimensions by given metrics (intersection). Ex. --metrics bookings,messages --help Show this message and exit. ``` @@ -72,7 +148,10 @@ Options: This command lists all dimension values with the corresponding metric: ```bash -mf list dimension-values --metrics --dimension +dbt sl list dimension-values --metrics --dimension # In dbt Cloud + +mf list dimension-values --metrics --dimension # In dbt Core + Options: --dimension TEXT Dimension to query values from [required] --metrics SEQUENCE Metrics that are associated with the dimension @@ -83,24 +162,30 @@ Options: of the data (inclusive) --help Show this message and exit. ``` -## List entities + +### List entities This command lists all unique entities: ```bash -mf list entities --metrics +dbt sl list entities --metrics # In dbt Cloud + +mf list entities --metrics # In dbt Core + Options: - --metrics SEQUENCE List entities by given metrics (intersection). Ex. - --metrics bookings,messages + --metrics SEQUENCE List entities by given metrics (intersection). Ex. --metrics bookings,messages --help Show this message and exit. ``` -## Validate-configs +### Validate-configs This command performs validations against the defined semantic model configurations: ```bash -mf validate-configs +dbt sl validate-configs # In dbt Cloud + +mf validate-configs # In dbt Core + Options: --dw-timeout INTEGER Optional timeout for data warehouse validation steps. Default None. @@ -118,28 +203,34 @@ Options: --help Show this message and exit. 
``` -## Health checks +### Health checks This command performs a health check against the data platform you provided in the configs: ```bash -mf health-checks +dbt sl health-checks #in dbt Cloud + +mf health-checks #in dbt Core ``` -## Tutorial +### Tutorial Follow the dedicated MetricFlow tutorial to help you get started: ```bash -mf tutorial +dbt sl tutorial # In dbt Cloud + +mf tutorial # In dbt Core ``` -## Query +### Query Create a new query with MetricFlow, execute that query against the user's data platform, and return the result: ```bash -mf query --metrics --group-by +dbt sl query --metrics --group-by # In dbt Cloud + +mf query --metrics --group-by # In dbt Core Options: @@ -170,8 +261,9 @@ Options: --csv FILENAME Provide filepath for data frame output to csv - --explain In the query output, show the query that was - executed against the data warehouse + --compile (dbt Cloud) In the query output, show the query that was + --explain (dbt Core) executed against the data warehouse + --show-dataflow-plan Display dataflow plan in explain output @@ -186,7 +278,7 @@ Options: ``` -## Query examples +### Query examples The following tabs present various different types of query examples that you can use to query metrics and dimensions. Select the tab that best suits your needs: @@ -198,7 +290,9 @@ Use the example to query metrics by dimension and return the `order_total` metri **Query** ```bash -mf query --metrics order_total --group-by metric_time +dbt sl query --metrics order_total --group-by metric_time # In dbt Cloud + +mf query --metrics order_total --group-by metric_time # In dbt Core ``` **Result** @@ -221,7 +315,9 @@ You can include multiple dimensions in a query. For example, you can group by th **Query** ```bash -mf query --metrics order_total --group-by metric_time, is_food_order +dbt sl query --metrics order_total --group-by metric_time, is_food_order # In dbt Cloud + +mf query --metrics order_total --group-by metric_time, is_food_order # In dbt Core ``` **Result** @@ -248,7 +344,11 @@ You can add order and limit functions to filter and present the data in a readab **Query** ```bash -mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time +# In dbt Cloud +dbt sl query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time + +# In dbt Core +mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time ``` **Result** @@ -273,7 +373,11 @@ You can further filter the data set by adding a `where` clause to your query. **Query** ```bash -mf query --metrics order_total --group-by metric_time --where "{{Dimension('order_id__is_food_order')}} = True" +# In dbt Cloud +dbt sl query --metrics order_total --group-by metric_time --where "{{ Dimension('order_id__is_food_order') }} = True" + +# In dbt Core +mf query --metrics order_total --group-by metric_time --where "{{ Dimension('order_id__is_food_order') }} = True" ``` **Result** @@ -301,7 +405,12 @@ To filter by time, there are dedicated start and end time options. 
Using these o **Query** ```bash - mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' + +# In dbt Cloud +dbt sl query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' + +# In dbt Core +mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' ``` **Result** @@ -331,20 +440,24 @@ The following tabs present additional query examples, like exporting to a CSV. S - + -Add `--explain` to your query to view the SQL generated by MetricFlow. +Add `--compile` (or `--explain` for dbt Core users) to your query to view the SQL generated by MetricFlow. **Query** ```bash - mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' --explain +# In dbt Cloud +dbt sl query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' --compile + +# In dbt Core +mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' --explain ``` **Result** ```bash ✔ Success 🦄 - query completed after 0.28 seconds -🔎 SQL (remove --explain to see data or add --show-dataflow-plan to see the generated dataflow plan): +🔎 SQL (remove --compile to see data or add --show-dataflow-plan to see the generated dataflow plan): SELECT metric_time , is_food_order @@ -374,6 +487,10 @@ Add the `--csv file_name.csv` flag to export the results of your query to a csv. **Query** ```bash +# In dbt Cloud +dbt sl query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' --csv query_example.csv + +# In dbt Core mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27' --csv query_example.csv ``` @@ -386,14 +503,16 @@ mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 -
-## Time granularity +### Time granularity Optionally, you can specify the time granularity you want your data to be aggregated at by appending two underscores and the unit of granularity you want to `metric_time`, the global time dimension. You can group the granularity by: `day`, `week`, `month`, `quarter`, and `year`. Below is an example for querying metric data at a monthly grain: ```bash -mf query --metrics revenue --group-by metric_time__month +dbt sl query --metrics revenue --group-by metric_time__month # In dbt Cloud + +mf query --metrics revenue --group-by metric_time__month # In dbt Core ``` ## FAQs @@ -403,7 +522,7 @@ mf query --metrics revenue --group-by metric_time__month To add a dimension filter to a where filter, you have to indicate that the filter item is part of your model and use a template wrapper: {{Dimension('primary_entity__dimension_name')}}. -Here's an example query: mf query --metrics order_total --group-by metric_time --where "{{Dimension('order_id__is_food_order')}} = True".

Before using the template wrapper, however, you will need to set up your terminal to escape curly braces for the filter template to work. +Here's an example query: dbt sl query --metrics order_total --group-by metric_time --where "{{Dimension('order_id__is_food_order')}} = True".

Before using the template wrapper, however, you will need to set up your terminal to escape curly braces for the filter template to work.
How to set up your terminal to escape curly braces? @@ -436,3 +555,4 @@ Keep in mind that modifying your shell configuration files can have an impact on
+ diff --git a/website/docs/docs/build/metricflow-time-spine.md b/website/docs/docs/build/metricflow-time-spine.md index 254fa3cc5f0..997d85e38a8 100644 --- a/website/docs/docs/build/metricflow-time-spine.md +++ b/website/docs/docs/build/metricflow-time-spine.md @@ -12,6 +12,8 @@ To create this table, you need to create a model in your dbt project called `met + + ```sql {{ config( @@ -38,8 +40,44 @@ final as ( select * from final ``` + + + + + +```sql +{{ + config( + materialized = 'table', + ) +}} + +with days as ( + + {{ + dbt.date_spine( + 'day', + "to_date('01/01/2000','mm/dd/yyyy')", + "to_date('01/01/2027','mm/dd/yyyy')" + ) + }} + +), + +final as ( + select cast(date_day as date) as date_day + from days +) + +select * from final +``` + + + + + ```sql -- filename: metricflow_time_spine.sql -- BigQuery supports DATE() instead of TO_DATE(). Use this model if you're using BigQuery @@ -61,4 +99,33 @@ final as ( select * from final ``` + + + + + +```sql +-- filename: metricflow_time_spine.sql +-- BigQuery supports DATE() instead of TO_DATE(). Use this model if you're using BigQuery +{{config(materialized='table')}} +with days as ( + {{dbt.date_spine( + 'day', + "DATE(2000,01,01)", + "DATE(2030,01,01)" + ) + }} +), + +final as ( + select cast(date_day as date) as date_day + from days +) + +select * +from final +``` + + + You only need to include the `date_day` column in the table. MetricFlow can handle broader levels of detail, but it doesn't currently support finer grains. diff --git a/website/docs/docs/build/metrics-overview.md b/website/docs/docs/build/metrics-overview.md index e6d875386ee..81af149a7d9 100644 --- a/website/docs/docs/build/metrics-overview.md +++ b/website/docs/docs/build/metrics-overview.md @@ -4,6 +4,7 @@ id: metrics-overview description: "Metrics can be defined in the same or separate YAML files from semantic models within the same dbt project repo." sidebar_label: "Creating metrics" tags: [Metrics, Semantic Layer] +pagination_next: "docs/build/cumulative" --- Once you've created your semantic models, it's time to start adding metrics! Metrics can be defined in the same YAML files as your semantic models, or split into separate YAML files into any other subdirectories (provided that these subdirectories are also within the same dbt project repo) @@ -16,7 +17,7 @@ The keys for metrics definitions are: | `description` | Provide the description for your metric. | Optional | | `type` | Define the type of metric, which can be `simple`, `ratio`, `cumulative`, or `derived`. | Required | | `type_params` | Additional parameters used to configure metrics. `type_params` are different for each metric type. | Required | -| `configs` | Provide the specific configurations for your metric. | Optional | +| `config` | Provide the specific configurations for your metric. | Optional | | `label` | The display name for your metric. This value will be shown in downstream tools. | Required | | `filter` | You can optionally add a filter string to any metric type, applying filters to dimensions, entities, or time dimensions during metric computation. Consider it as your WHERE clause. | Optional | | `meta` | Additional metadata you want to add to your metric. | Optional | @@ -31,10 +32,10 @@ metrics: type: the type of the metric ## Required type_params: ## Required - specific properties for the metric type - configs: here for `enabled` ## Optional + config: here for `enabled` ## Optional label: The display name for your metric. This value will be shown in downstream tools. 
## Required filter: | ## Optional - {{ Dimension('entity__name') }} > 0 and {{ Dimension(' entity__another name') }} is not + {{ Dimension('entity__name') }} > 0 and {{ Dimension('entity__another_name') }} is not null ``` diff --git a/website/docs/docs/build/models.md b/website/docs/docs/build/models.md index e0683158e6d..d10eb5ed01a 100644 --- a/website/docs/docs/build/models.md +++ b/website/docs/docs/build/models.md @@ -2,6 +2,8 @@ title: "About dbt models" description: "Read this tutorial to learn how to use models when building in dbt." id: "models" +pagination_next: "docs/build/sql-models" +pagination_prev: null --- ## Overview diff --git a/website/docs/docs/build/organize-your-outputs.md b/website/docs/docs/build/organize-your-outputs.md new file mode 100644 index 00000000000..ad5efeda1c7 --- /dev/null +++ b/website/docs/docs/build/organize-your-outputs.md @@ -0,0 +1,38 @@ +--- +title: "Organize your outputs" +description: "Learn how you can organize your outputs" +pagination_next: "docs/build/custom-schemas" +pagination_prev: null +--- + +
+ + + + + +
+
+
+ + + + + +
\ No newline at end of file diff --git a/website/docs/docs/build/packages.md b/website/docs/docs/build/packages.md index 74e25262994..8d18a55e949 100644 --- a/website/docs/docs/build/packages.md +++ b/website/docs/docs/build/packages.md @@ -3,7 +3,7 @@ title: "Packages" id: "packages" --- -## What is a package? + Software engineers frequently modularize code into libraries. These libraries help programmers operate with leverage: they can spend more time focusing on their unique business logic, and less time implementing code that someone else has already spent the time perfecting. In dbt, libraries like these are called _packages_. dbt's packages are so powerful because so many of the analytic problems we encountered are shared across organizations, for example: @@ -22,13 +22,19 @@ dbt _packages_ are in fact standalone dbt projects, with models and macros that * Models in the package will be materialized when you `dbt run`. * You can use `ref` in your own models to refer to models from the package. * You can use macros in the package in your own project. +* It's important to note that defining and installing dbt packages is different from [defining and installing Python packages](/docs/build/python-models#using-pypi-packages) -:::note Using Python packages -Defining and installing dbt packages is different from [defining and installing Python packages](/docs/build/python-models#using-pypi-packages). +:::info `dependencies.yml` has replaced `packages.yml` +Starting from dbt v1.6, `dependencies.yml` has replaced `packages.yml`. This file can now contain both types of dependencies: "package" and "project" dependencies. +- "Package" dependencies lets you add source code from someone else's dbt project into your own, like a library. +- "Project" dependencies provide a different way to build on top of someone else's work in dbt. Refer to [Project dependencies](/docs/collaborate/govern/project-dependencies) for more info. +- +You can rename `packages.yml` to `dependencies.yml`, _unless_ you need to use Jinja within your packages specification. This could be necessary, for example, if you want to add an environment variable with a git token in a private git package specification. ::: + ## How do I add a package to my project? 1. Add a file named `dependencies.yml` or `packages.yml` to your dbt project. This should be at the same level as your `dbt_project.yml` file. 2. Specify the package(s) you wish to add using one of the supported syntaxes, for example: @@ -366,3 +372,4 @@ packages: ```
+ diff --git a/website/docs/docs/build/project-variables.md b/website/docs/docs/build/project-variables.md index a69132d6a3b..59d6be49b17 100644 --- a/website/docs/docs/build/project-variables.md +++ b/website/docs/docs/build/project-variables.md @@ -1,6 +1,7 @@ --- title: "Project variables" id: "project-variables" +pagination_next: "docs/build/environment-variables" --- dbt provides a mechanism, [variables](/reference/dbt-jinja-functions/var), to provide data to models for @@ -27,7 +28,7 @@ Jinja is not supported within the `vars` config, and all values will be interpre :::info New in v0.17.0 The syntax for specifying vars in the `dbt_project.yml` file has changed in -dbt v0.17.0. See the [migration guide](/guides/migration/versions) +dbt v0.17.0. See the [migration guide](/docs/dbt-versions/core-upgrade) for more information on these changes. ::: diff --git a/website/docs/docs/build/projects.md b/website/docs/docs/build/projects.md index 0d7dd889fa6..b4b04e3334d 100644 --- a/website/docs/docs/build/projects.md +++ b/website/docs/docs/build/projects.md @@ -1,6 +1,8 @@ --- title: "About dbt projects" id: "projects" +pagination_next: null +pagination_prev: null --- A dbt project informs dbt about the context of your project and how to transform your data (build your data sets). By design, dbt enforces the top-level structure of a dbt project such as the `dbt_project.yml` file, the `models` directory, the `snapshots` directory, and so on. Within the directories of the top-level, you can organize your project in any way that meets the needs of your organization and data pipeline. diff --git a/website/docs/docs/build/saved-queries.md b/website/docs/docs/build/saved-queries.md new file mode 100644 index 00000000000..39a4b2e52fd --- /dev/null +++ b/website/docs/docs/build/saved-queries.md @@ -0,0 +1,43 @@ +--- +title: Saved queries +id: saved-queries +description: "Saved queries are a way to save commonly used queries in MetricFlow. They can be used to save time and avoid writing the same query over and over again." +sidebar_label: "Saved queries" +tags: [Metrics, Semantic Layer] +--- + +:::info Saved queries coming soon +Saved queries aren't currently available in MetricFlow, but support is coming soon. +::: + +Saved queries are a way to save commonly used queries in MetricFlow. You can group metrics, dimensions, and filters that are logically related into a saved query. + +To define a saved query, refer to the following specification: + +| Parameter | Description | Type | +| --------- | ----------- | ---- | +| `name` | The name of the saved query. | Required | +| `description` | The description of the saved query. | Optional | +| `metrics` | The metrics included in the saved query. | Required | +| `group_bys` | The dimensions, entities, or time dimensions to group the metrics by. | Required | +| `where` | Filter applied to the query. | Optional | +The following is an example of a saved query: + +```yaml +saved_query: + name: p0_booking + description: Booking-related metrics that are of the highest priority. + metrics: + - bookings + - instant_bookings + group_bys: + - TimeDimension('metric_time', 'day') + - Dimension('listing__capacity_latest') + where: + - "{{ Dimension('listing__capacity_latest') }} > 3" +``` + +### FAQs + +* All metrics in a saved query need to use the same dimensions in the `group_by` or `where` clauses. 
diff --git a/website/docs/docs/build/semantic-models.md b/website/docs/docs/build/semantic-models.md index bb56bd212e6..118e93a26b1 100644 --- a/website/docs/docs/build/semantic-models.md +++ b/website/docs/docs/build/semantic-models.md @@ -6,19 +6,25 @@ keywords: - dbt metrics layer sidebar_label: Semantic models tags: [Metrics, Semantic Layer] +pagination_next: "docs/build/dimensions" --- -Semantic models serve as the foundation for defining data in MetricFlow, which powers the dbt Semantic Layer. You can think of semantic models as nodes in your semantic graph, connected via entities as edges. MetricFlow takes semantic models defined in YAML configuration files as inputs and creates a semantic graph that can be used to query metrics. +Semantic models are the foundation for data definition in MetricFlow, which powers the dbt Semantic Layer: -Each semantic model corresponds to a dbt model in your DAG. Therefore you will have one YAML config for each semantic model in your dbt project. You can create multiple semantic models out of a single dbt model, as long as you give each semantic model a unique name. +- Think of semantic models as nodes connected by entities in a semantic graph. +- MetricFlow uses YAML configuration files to create this graph for querying metrics. +- Each semantic model corresponds to a dbt model in your DAG, requiring a unique YAML configuration for each semantic model. +- You can create multiple semantic models from a single dbt model, as long as you give each semantic model a unique name. +- Configure semantic models in a YAML file within your dbt project directory. +- Organize them under a `metrics:` folder or within project sources as needed. -You can configure semantic models in your dbt project directory in a `YAML` file. Depending on your project structure, you can nest semantic models under a `metrics:` folder or organize them under project sources. + Semantic models have 6 components and this page explains the definitions with some examples: | Component | Description | Type | | --------- | ----------- | ---- | -| [Name](#name) | Unique name for the semantic model | Required | +| [Name](#name) | Choose a unique name for the semantic model. Avoid using double underscores (__) in the name as they're not supported. | Required | | [Description](#description) | Includes important details in the description | Optional | | [Model](#model) | Specifies the dbt model for the semantic model using the `ref` function | Required | | [Defaults](#defaults) | The defaults for the model, currently only `agg_time_dimension` is supported. | Required | @@ -26,6 +32,7 @@ Semantic models have 6 components and this page explains the definitions with so | [Primary Entity](#primary-entity) | If a primary entity exists, this component is Optional. If the semantic model has no primary entity, then this property is required. | Optional | | [Dimensions](#dimensions) | Different ways to group or slice data for a metric, they can be `time` or `categorical` | Required | | [Measures](#measures) | Aggregations applied to columns in your data model. They can be the final metric or used as building blocks for more complex metrics | Optional | +| Label | The display name for your semantic model `node`, `dimension`, `entity`, and/or `measures` | Optional | ## Semantic models components @@ -105,9 +112,32 @@ semantic_models: type: categorical ``` + + +Semantic models support configs in either the schema file or at the project level. 
+ +Semantic model config in `models/semantic.yml`: +```yml +semantic_models: + - name: orders + config: + enabled: true | false + group: some_group +``` + +Semantic model config in `dbt_project.yml`: +```yml +semantic_models: + my_project_name: + +enabled: true | false + +group: some_group +``` + + + ### Name -Define the name of the semantic model. You must define a unique name for the semantic model. The semantic graph will use this name to identify the model, and you can update it at any time. +Define the name of the semantic model. You must define a unique name for the semantic model. The semantic graph will use this name to identify the model, and you can update it at any time. Avoid using double underscores (__) in the name as they're not supported. ### Description @@ -205,8 +235,7 @@ For semantic models with a measure, you must have a [primary time group](/docs/b | `agg` | dbt supports the following aggregations: `sum`, `max`, `min`, `count_distinct`, and `sum_boolean`. | Required | | `expr` | You can either reference an existing column in the table or use a SQL expression to create or derive a new one. | Optional | | `non_additive_dimension` | Non-additive dimensions can be specified for measures that cannot be aggregated over certain dimensions, such as bank account balances, to avoid producing incorrect results. | Optional | -| `create_metric`* | You can create a metric directly from a measure with create_metric: True and specify its display name with create_metric_display_name. | Optional | -_*Coming soon_ +| `create_metric` | You can create a metric directly from a measure with `create_metric: True` and specify its display name with create_metric_display_name. Default is false. | Optional | import SetUpPages from '/snippets/_metrics-dependencies.md'; diff --git a/website/docs/docs/build/simple.md b/website/docs/docs/build/simple.md index 7022ca9d007..1803e952a69 100644 --- a/website/docs/docs/build/simple.md +++ b/website/docs/docs/build/simple.md @@ -4,6 +4,7 @@ id: simple description: "Use simple metrics to directly reference a single measure." sidebar_label: Simple tags: [Metrics, Semantic Layer] +pagination_next: null --- Simple metrics are metrics that directly reference a single measure, without any additional measures involved. They are aggregations over a column in your data platform and can be filtered by one or multiple dimensions. diff --git a/website/docs/docs/build/sl-getting-started.md b/website/docs/docs/build/sl-getting-started.md index c0bf59ae0c2..64cec11c302 100644 --- a/website/docs/docs/build/sl-getting-started.md +++ b/website/docs/docs/build/sl-getting-started.md @@ -8,38 +8,33 @@ meta: api_name: dbt Semantic Layer APIs --- -import InstallMetricFlow from '/snippets/_sl-install-metricflow.md'; import CreateModel from '/snippets/_sl-create-semanticmodel.md'; import DefineMetrics from '/snippets/_sl-define-metrics.md'; import ConfigMetric from '/snippets/_sl-configure-metricflow.md'; import TestQuery from '/snippets/_sl-test-and-query-metrics.md'; +import ConnectQueryAPI from '/snippets/_sl-connect-and-query-api.md'; +import RunProdJob from '/snippets/_sl-run-prod-job.md'; -This getting started page presents a sample workflow to help you create your first metrics in dbt Cloud or the command-line interface (CLI). It uses the [Jaffle shop example project](https://github.com/dbt-labs/jaffle-sl-template) as the project data source and is available for you to use. 
+This getting started page presents a sample workflow to help you create your first metrics in dbt Cloud or the command line interface (CLI). It uses the [Jaffle shop example project](https://github.com/dbt-labs/jaffle-sl-template) as the project data source and is available for you to use. If you prefer, you can create semantic models and metrics for your own dbt project. This page will guide you on how to: - [Create a semantic model](#create-a-semantic-model) using MetricFlow - [Define metrics](#define-metrics) using MetricFlow -- [Test and query metrics locally](#test-and-query-metrics) using MetricFlow +- [Test and query metrics](#test-and-query-metrics) using MetricFlow - [Run a production job](#run-a-production-job) in dbt Cloud - [Set up dbt Semantic Layer](#set-up-dbt-semantic-layer) in dbt Cloud - [Connect to and query the API](#connect-and-query-api) with dbt Cloud - -MetricFlow allows users to define metrics in their dbt project whether in dbt Cloud or in dbt Core. dbt Core users can use the [MetricFlow CLI](/docs/build/metricflow-cli) to define metrics in their local dbt Core project. +MetricFlow allows you to define metrics in your dbt project and query them whether in dbt Cloud or dbt Core with [MetricFlow commands](/docs/build/metricflow-commands). However, to experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. ## Prerequisites -- Have an understanding of key concepts in [MetricFlow](/docs/build/about-metricflow), which powers the revamped dbt Semantic Layer. -- Have both your production and development environments running dbt version 1.6 or higher. Refer to [upgrade in dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) for more info. -- Use Snowflake, BigQuery, Databricks, Redshift, or Postgres (Postgres available in the CLI only, dbt Cloud support coming soon). -- Create a successful run in the environment where you configure the Semantic Layer. - - **Note:** Semantic Layer currently supports the Deployment environment for querying. (_development querying experience coming soon_) -- Set up the [Semantic Layer API](/docs/dbt-cloud-apis/sl-api-overview) in the integrated tool to import metric definitions. - - **Note:** To access the API and query metrics in downstream tools, you must have a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. dbt Core or Developer accounts can define metrics using [MetricFlow CLI](/docs/build/metricflow-cli) or the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud).
-- Understand [MetricFlow's](/docs/build/about-metricflow) key concepts, which powers the revamped dbt Semantic Layer. +import SetUp from '/snippets/_v2-sl-prerequisites.md'; + + :::tip New to dbt or metrics? Try our [Jaffle shop example project](https://github.com/dbt-labs/jaffle-sl-template) to help you get started! @@ -63,15 +58,7 @@ New to dbt or metrics? Try our [Jaffle shop example project](https://github.com/ ## Run a production job -Before you begin, you must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment, hosted in North America (cloud.getdbt.com login URL). - -Once you’ve defined metrics in your dbt project, you can perform a job run in your dbt Cloud deployment environment to materialize your metrics. Only the deployment environment is supported for the dbt Semantic Layer at this moment. - -1. Go to **Deploy** in the menu bar -2. Select **Jobs** to re-run the job with the most recent code in the deployment environment. -3. Your metric should appear as a red node in the dbt Cloud IDE and dbt directed acyclic graphs (DAG). - - + ## Set up dbt Semantic Layer @@ -81,16 +68,7 @@ import SlSetUp from '/snippets/_new-sl-setup.md'; ## Connect and query API -You can query your metrics in a JDBC-enabled tool or use existing first-class integrations with the dbt Semantic Layer. - -You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment, hosted in North America. (Additional region support coming soon) - -- To learn how to use the JDBC or GraphQL API and what tools you can query it with, refer to the
{frontMatter.meta.api_name}.
- - * To authenticate, you need to [generate a service token](/docs/dbt-cloud-apis/service-tokens) with Semantic Layer Only and Metadata Only permissions. - * Refer to the [SQL query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the API. - -- To learn more about the sophisticated integrations that connect to the dbt Semantic Layer, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) for more info. + ## FAQs @@ -116,3 +94,4 @@ The dbt Semantic Layer is proprietary, however, some components of the dbt Seman - [Build your metrics](/docs/build/build-metrics-intro) - [Get started with the dbt Semantic Layer](/docs/use-dbt-semantic-layer/quickstart-sl) - [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) +- Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) diff --git a/website/docs/docs/build/tests.md b/website/docs/docs/build/tests.md index fa78d0df905..75c358155b2 100644 --- a/website/docs/docs/build/tests.md +++ b/website/docs/docs/build/tests.md @@ -163,7 +163,7 @@ Done. PASS=2 WARN=0 ERROR=0 SKIP=0 TOTAL=2 ``` 3. Check out the SQL dbt is running by either: * **dbt Cloud:** checking the Details tab. - * **dbt CLI:** checking the `target/compiled` directory + * **dbt Core:** checking the `target/compiled` directory **Unique test** @@ -241,7 +241,7 @@ where {{ column_name }} is null ## Storing test failures -Normally, a test query will calculate failures as part of its execution. If you set the optional `--store-failures` flag or [`store_failures` config](/reference/resource-configs/store_failures), dbt will first save the results of a test query to a table in the database, and then query that table to calculate the number of failures. +Normally, a test query will calculate failures as part of its execution. If you set the optional `--store-failures` flag, the [`store_failures`](/reference/resource-configs/store_failures), or the [`store_failures_as`](/reference/resource-configs/store_failures_as) configs, dbt will first save the results of a test query to a table in the database, and then query that table to calculate the number of failures. This workflow allows you to query and examine failing records much more quickly in development: diff --git a/website/docs/docs/build/validation.md b/website/docs/docs/build/validation.md index ad485850d23..02ce48729a4 100644 --- a/website/docs/docs/build/validation.md +++ b/website/docs/docs/build/validation.md @@ -12,16 +12,14 @@ These validations ensure that configuration files follow the expected schema, th The code that handles validation [can be found here](https://github.com/dbt-labs/dbt-semantic-interfaces/tree/main/dbt_semantic_interfaces/validations) for those who want to dive deeper into this topic. 
-## Prerequisites - -- You have installed the [MetricFlow CLI package](https://github.com/dbt-labs/metricflow) ## Validations command -You can run validations from the CLI with the following [MetricFlow commands](/docs/build/metricflow-cli): +You can run validations from dbt Cloud or the command line with the following [MetricFlow commands](/docs/build/metricflow-commands): ```bash -mf validate-configs +dbt sl validate-configs # dbt Cloud users +mf validate-configs # dbt Core users ``` ## Parsing diff --git a/website/docs/docs/cloud/about-cloud-develop-defer.md b/website/docs/docs/cloud/about-cloud-develop-defer.md new file mode 100644 index 00000000000..85bf20880f8 --- /dev/null +++ b/website/docs/docs/cloud/about-cloud-develop-defer.md @@ -0,0 +1,55 @@ +--- +title: Using defer in dbt Cloud +id: about-cloud-develop-defer +description: "Learn how to leverage defer to prod when developing with dbt Cloud." +sidebar_label: "Using defer in dbt Cloud" +pagination_next: "docs/cloud/cloud-cli-installation" +--- + + +[Defer](/reference/node-selection/defer) is a powerful feature that allows developers to only build and run and test models they've edited without having to first run and build all the models that come before them (upstream parents). This is powered by using a production manifest for comparison, and dbt will resolve the `{{ ref() }}` function with upstream production artifacts. + +Both the dbt Cloud IDE and the dbt Cloud CLI allow users to natively defer to production metadata directly in their development workflows, dramatically reducing development time and warehouse spend by preventing unnecessary model builds. + +## Required setup + +- You must select the **[Production environment](/docs/deploy/deploy-environments#set-as-production-environment-beta)** checkbox in the **Environment Settings** page. + - This can be set for one deployment environment per dbt Cloud project. +- You must have a successful job run first. + +When using defer, it compares artifacts from the most recent successful production job, excluding CI jobs. + +### Defer in the dbt Cloud IDE + +To enable defer in the dbt Cloud IDE, toggle the **Defer to production** button on the command bar. Once enabled, dbt Cloud will: + +1. Pull down the most recent manifest from the Production environment for comparison +2. Pass the `--defer` flag to the command (for any command that accepts the flag) + +For example, if you were to start developing on a new branch with [nothing in your development schema](/reference/node-selection/defer#usage), edit a single model, and run `dbt build -s state:modified` — only the edited model would run. Any `{{ ref() }}` functions will point to the production location of the referenced models. + + + +### Defer in dbt Cloud CLI + +One key difference between using `--defer` in the dbt Cloud CLI and the dbt Cloud IDE is that `--defer` is *automatically* enabled in the dbt Cloud CLI for all invocations, compared with production artifacts. You can disable it with the `--no-defer` flag. + +The dbt Cloud CLI offers additional flexibility by letting you choose the source environment for deferral artifacts. You can set a `defer-env-id` key in either your `dbt_project.yml` or `dbt_cloud.yml` file. If you do not provide a `defer-env-id` setting, the dbt Cloud CLI will use artifacts from your dbt Cloud environment marked "Production". 
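For example, a single dbt Cloud CLI invocation can opt out of deferral with the `--no-defer` flag described above (a minimal sketch; the `defer-env-id` file examples follow below):

```bash
# Build everything in your own development schema for this one run,
# instead of resolving unchanged upstream refs against production artifacts.
dbt build --no-defer
```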
+ + + + ```yml +defer-env-id: '123456' +``` + + + + + + +```yml +dbt_cloud: + defer-env-id: '123456' +``` + + diff --git a/website/docs/docs/cloud/about-cloud-develop.md b/website/docs/docs/cloud/about-cloud-develop.md new file mode 100644 index 00000000000..9f864ede5ca --- /dev/null +++ b/website/docs/docs/cloud/about-cloud-develop.md @@ -0,0 +1,33 @@ +--- +title: About developing in dbt Cloud +id: about-cloud-develop +description: "Learn how to develop your dbt projects using dbt Cloud." +sidebar_label: "About developing in dbt Cloud" +pagination_next: "docs/cloud/cloud-cli-installation" +hide_table_of_contents: true +--- + +dbt Cloud offers a fast and reliable way to work on your dbt project. It runs dbt Core in a hosted (single or multi-tenant) environment. You can develop in your browser using an integrated development environment (IDE) or in a dbt Cloud-powered command line interface (CLI): + +
+ + + + + +

+ +The following sections provide detailed instructions on setting up the dbt Cloud CLI and dbt Cloud IDE. To get started with dbt development, you'll need a [developer](/docs/cloud/manage-access/seats-and-users) account. For a more comprehensive guide about developing in dbt, refer to our [quickstart guides](/quickstarts). + + +--------- +**Note**: The dbt Cloud CLI and the open-sourced dbt Core are both command line tools that let you run dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features). + diff --git a/website/docs/docs/cloud/about-cloud-setup.md b/website/docs/docs/cloud/about-cloud-setup.md index baa2465472e..7b68b52a45a 100644 --- a/website/docs/docs/cloud/about-cloud-setup.md +++ b/website/docs/docs/cloud/about-cloud-setup.md @@ -3,6 +3,8 @@ title: About dbt Cloud setup id: about-cloud-setup description: "Configuration settings for dbt Cloud." sidebar_label: "About dbt Cloud setup" +pagination_next: "docs/dbt-cloud-environments" +pagination_prev: null --- dbt Cloud is the fastest and most reliable way to deploy your dbt jobs. It contains a myriad of settings that can be configured by admins, from the necessities (data platform integration) to security enhancements (SSO) and quality-of-life features (RBAC). This portion of our documentation will take you through the various settings found by clicking on the gear icon in the dbt Cloud UI, including: @@ -11,6 +13,8 @@ dbt Cloud is the fastest and most reliable way to deploy your dbt jobs. It conta - Configuring access to [GitHub](/docs/cloud/git/connect-github), [GitLab](/docs/cloud/git/connect-gitlab), or your own [git repo URL](/docs/cloud/git/import-a-project-by-git-url). - [Managing users and licenses](/docs/cloud/manage-access/seats-and-users) - [Configuring secure access](/docs/cloud/manage-access/about-user-access) +- Configuring the [dbt Cloud IDE](/docs/cloud/about-cloud-develop) +- Installing and configuring the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) These settings are intended for dbt Cloud administrators. If you need a more detailed first-time setup guide for specific data platforms, read our [quickstart guides](/quickstarts). diff --git a/website/docs/docs/cloud/about-cloud/about-cloud-ide.md b/website/docs/docs/cloud/about-cloud/about-cloud-ide.md index f0380f109f8..7643928feec 100644 --- a/website/docs/docs/cloud/about-cloud/about-cloud-ide.md +++ b/website/docs/docs/cloud/about-cloud/about-cloud-ide.md @@ -25,7 +25,7 @@ With the Cloud IDE, you can: For more information, read the complete [Cloud IDE guide](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). 
-## Relatd docs
+## Related docs

- [IDE user interface](/docs/cloud/dbt-cloud-ide/ide-user-interface)
- [Tips and tricks](/docs/cloud/dbt-cloud-ide/dbt-cloud-tips)
diff --git a/website/docs/docs/cloud/about-cloud/dbt-cloud-features.md b/website/docs/docs/cloud/about-cloud/about-dbt-cloud.md
similarity index 78%
rename from website/docs/docs/cloud/about-cloud/dbt-cloud-features.md
rename to website/docs/docs/cloud/about-cloud/about-dbt-cloud.md
index f1d8b32cdb1..71f3175a108 100644
--- a/website/docs/docs/cloud/about-cloud/dbt-cloud-features.md
+++ b/website/docs/docs/cloud/about-cloud/about-dbt-cloud.md
@@ -4,86 +4,95 @@ id: "dbt-cloud-features"
sidebar_label: "dbt Cloud features"
description: "Explore dbt Cloud's features and learn why dbt Cloud is the fastest way to deploy dbt"
hide_table_of_contents: true
+pagination_next: "docs/cloud/about-cloud/architecture"
+pagination_prev: null
---

-dbt Cloud is the fastest and most reliable way to deploy dbt. Develop, test, schedule, document, and investigate data models all in one browser-based UI. In addition to providing a hosted architecture for running dbt across your organization, dbt Cloud comes equipped with turnkey support for scheduling jobs, CI/CD, hosting documentation, monitoring & alerting, and an integrated development environment (IDE).
+dbt Cloud is the fastest and most reliable way to deploy dbt. Develop, test, schedule, document, and investigate data models all in one browser-based UI.
+
+In addition to providing a hosted architecture for running dbt across your organization, dbt Cloud comes equipped with turnkey support for scheduling jobs, CI/CD, hosting documentation, monitoring and alerting, and an integrated development environment (IDE), and it lets you develop and run dbt commands from your local command line interface (CLI) or code editor.

dbt Cloud's [flexible plans](https://www.getdbt.com/pricing/) and features make it well-suited for data teams of any size — sign up for your [free 14-day trial](https://www.getdbt.com/signup/)!
+ + + link="/docs/cloud/dbt-cloud-ide/develop-in-the-cloud" + icon="dbt-bit"/> + icon="dbt-bit"/> + icon="dbt-bit"/> + icon="dbt-bit"/> + + + icon="dbt-bit"/> + icon="dbt-bit"/> + icon="dbt-bit"/> + icon="dbt-bit"/> + icon="dbt-bit"/> + icon="dbt-bit"/> - - - - + icon="dbt-bit"/>

*These features are available on [selected plans](https://www.getdbt.com/pricing/). diff --git a/website/docs/docs/cloud/about-cloud/architecture.md b/website/docs/docs/cloud/about-cloud/architecture.md index 4ad016f4007..52614f0cbcd 100644 --- a/website/docs/docs/cloud/about-cloud/architecture.md +++ b/website/docs/docs/cloud/about-cloud/architecture.md @@ -42,7 +42,7 @@ Some data warehouse providers offer advanced security features that can be lever ### Git sync -dbt Cloud can sync with a variety of git providers, including [Github](/docs/cloud/git/connect-github), [Gitlab](/docs/cloud/git/connect-gitlab), and [Azure DevOps](/docs/cloud/git/connect-azure-devops) within its integrated development environment ([IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). Communication takes place over HTTPS rather than SSH and is protected using the TLS 1.2 protocol for data in transit. +dbt Cloud can sync with a variety of git providers, including [Github](/docs/cloud/git/connect-github), [Gitlab](/docs/cloud/git/connect-gitlab), and [Azure DevOps](/docs/cloud/git/connect-azure-devops) within its integrated development environment ([IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud)). Communication takes place over HTTPS rather than SSH and is protected using the TLS 1.2 protocol for data in transit. The git repo information is stored on dbt Cloud servers to make it accessible during the IDE sessions. When the git sync is disabled, you must [contact support](mailto:support@getdbt.com) to request the deletion of the synced data. diff --git a/website/docs/docs/cloud/about-cloud/browsers.md b/website/docs/docs/cloud/about-cloud/browsers.md index 2fc5a8b4b4d..12665bc7b72 100644 --- a/website/docs/docs/cloud/about-cloud/browsers.md +++ b/website/docs/docs/cloud/about-cloud/browsers.md @@ -2,6 +2,7 @@ title: "Supported browsers" id: "browsers" description: "dbt Cloud supports the latest browsers like Chrome and Firefox." +pagination_next: null --- To have the best experience with dbt Cloud, we recommend using the latest versions of the following browsers: diff --git a/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md b/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md index caeb0203a5e..4fcabbb3585 100644 --- a/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md +++ b/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md @@ -25,3 +25,13 @@ dbt Cloud is [hosted](/docs/cloud/about-cloud/architecture) in multiple regions There are two ways to view your dbt Cloud IP addresses: - If no projects exist in the account, create a new project, and the IP addresses will be displayed during the **Configure your environment** steps. - If you have an existing project, navigate to **Account Settings** and ensure you are in the **Projects** pane. Click on a project name, and the **Project Settings** window will open. Locate the **Connection** field and click on the name. Scroll down to the **Settings**, and the first text block lists your IP addresses. + +### Static IP addresses + +dbt Cloud, like many cloud services, relies on underlying AWS cloud infrastructure for operations. While we can offer static URLs for access, we cannot provide a list of IP addresses to configure connections due to the nature of AWS cloud services. + +* Dynamic IP addresses — dbt Cloud infrastructure uses Amazon Web Services (AWS). dbt Cloud offers static URLs for streamlined access, but the dynamic nature of cloud services means the underlying IP addresses change occasionally. 
AWS manages the IP ranges and may change them according to their operational and security needs.
+
+* Using hostnames for consistent access — To ensure uninterrupted access, we recommend that you access dbt Cloud services using hostnames. Hostnames provide a consistent reference point, regardless of any changes in underlying IP addresses. We are aligning with an industry-standard practice employed by organizations such as Snowflake.
+
+* Optimizing VPN connections &mdash; You should integrate a proxy alongside VPN for users who leverage VPN connections. This strategy enables steady IP addresses for your connections, facilitating smooth traffic flow through the VPN and onward to dbt Cloud. By employing a proxy and a VPN, you can direct traffic through the VPN and then to dbt Cloud. It's crucial to set up the proxy if you need to integrate with additional services.
diff --git a/website/docs/docs/cloud/billing.md b/website/docs/docs/cloud/billing.md
index 61251f6e41d..1d71d33e9a1 100644
--- a/website/docs/docs/cloud/billing.md
+++ b/website/docs/docs/cloud/billing.md
@@ -3,28 +3,81 @@ title: "Billing"
id: billing
description: "dbt Cloud billing information."
sidebar_label: Billing
+pagination_next: null
+pagination_prev: null
---

dbt Cloud offers a variety of [plans and pricing](https://www.getdbt.com/pricing/) to fit your organization’s needs. With flexible billing options that appeal to large enterprises and small businesses and [server availability](/docs/cloud/about-cloud/regions-ip-addresses) worldwide, dbt Cloud is the fastest and easiest way to begin transforming your data.

## How does dbt Cloud pricing work?

-As a customer, you pay for the number of seats you have and the amount of usage consumed each month. Usage is based on the number of Successful Models Built, and seats are billed primarily on the amount of Developer licenses purchased. All billing computations are conducted in Coordinated Universal Time (UTC).
+As a customer, you pay for the number of seats you have and the amount of usage consumed each month. Seats are billed primarily on the amount of Developer and Read licenses purchased. Usage is based on the number of [Successful Models Built](#what-counts-as-a-successful-model-built) and, if purchased and used, Semantic Layer Query Units subject to reasonable usage. All billing computations are conducted in Coordinated Universal Time (UTC).
+
+### What counts as a seat license?
+
+There are three types of possible seat licenses:
+
+* **Developer** — for roles and permissions that require interaction with the dbt Cloud environment day-to-day.
+* **Read-Only** — for access to view certain documents and reports.
+* **IT** — for access to specific features related to account management (for example, configuring git integration).

### What counts as a Successful Model Built?

-dbt Cloud considers a Successful Model Built as any model that is successfully built via a run through dbt Cloud’s orchestration functionality in a dbt Cloud deployment environment. Models are counted when built and run. This includes any jobs run via dbt Cloud's scheduler, CI builds (jobs triggered by pull requests), runs kicked off via the dbt Cloud API, and any successor dbt Cloud tools with similar functionality. This also includes models that are successfully built even when a run may fail to complete. For example, you may have a job that contains 100 models and on one of its runs, 51 models are successfully built and then the job fails. In this situation, only 51 models would be counted.
+dbt Cloud considers a Successful Model Built as any model that is successfully built via a run through dbt Cloud’s orchestration functionality in a dbt Cloud deployment environment. Models are counted when built and run. This includes any jobs run via dbt Cloud's scheduler, CI builds (jobs triggered by pull requests), runs kicked off via the dbt Cloud API, and any successor dbt Cloud tools with similar functionality. This also includes models that are successfully built even when a run may fail to complete. For example, you may have a job that contains 100 models and on one of its runs, 51 models are successfully built and then the job fails. In this situation, only 51 models would be counted.

Any models built in a dbt Cloud development environment (for example, via the IDE) do not count towards your usage. Tests, seeds, ephemeral models, and snapshots also do not count.

+| What counts towards Successful Models Built | |
+|---------------------------------------------|---------------------|
+| View | ✅ |
+| Table | ✅ |
+| Incremental | ✅ |
+| Ephemeral Models | ❌ |
+| Tests | ❌ |
+| Seeds | ❌ |
+| Snapshots | ❌ |

-### What counts as a seat license?
+### What counts as a Query Unit?

-There are three types of possible seat licenses:
+The dbt Semantic Layer, powered by MetricFlow, measures usage in distinct query units. Every successful request you make to render or run SQL to the Semantic Layer API counts as at least one query unit, even if no data is returned. If the query calculates or renders SQL for multiple metrics, each calculated metric will be counted as a query unit.
+If a request to run a query is not executed successfully in the data platform or if a query results in an error without completion, it is not counted as a query unit. Requests for metadata from the Semantic Layer are also not counted as query units.

-* **Developer** — for roles and permissions that require interaction with the dbt Cloud environment day-to-day.
-* **Read-Only** — for access to view certain documents and reports.
-* **IT** — for access to specific features related to account management (for example, configuring git integration).
+Examples of query units include:
+
+Querying one metric, grouping by one dimension → 1 query unit
+
+```shell
+dbt sl query --metrics revenue --group_by metric_time
+```
+Querying one metric, grouping by two dimensions → 1 query unit
+
+```shell
+dbt sl query --metrics revenue --group_by metric_time,user__country
+```
+
+Querying two metrics, grouping by two dimensions → 2 query units
+
+```shell
+dbt sl query --metrics revenue,gross_sales --group_by metric_time,user__country
+```
+
+Running an explain for one metric → 1 query unit
+
+```shell
+dbt sl query --metrics revenue --group_by metric_time --explain
+```
+
+Running an explain for two metrics → 2 query units
+
+```shell
+dbt sl query --metrics revenue,gross_sales --group_by metric_time --explain
+```
+
+Running a query for only dimensions such as dimension_values or a query with no metrics → 1 query unit
+
+```shell
+dbt sl list dimension-values --dimension user__country
+```

### Viewing usage in the product
@@ -59,7 +112,7 @@ All included successful models built numbers above reflect our most current pric

Team customers pay monthly via credit card for seats and usage, and accounts include 15,000 models monthly. Seats are charged upfront at the beginning of the month. If you add seats during the month, seats will be prorated and charged on the same day.
Seats removed during the month will be reflected on the next invoice and are not eligible for refunds. You can change the credit card information and the number of seats from the billings section anytime. Accounts will receive one monthly invoice that includes the upfront charge for the seats and the usage charged in arrears from the previous month. -Usage is calculated and charged in arrears for the previous month. If you exceed 15,000 models in any month, you will be billed for additional usage on your next invoice. Additional use is billed at the rates on our [pricing page](https://www.getdbt.com/pricing). +Usage is calculated and charged in arrears for the previous month. If you exceed 15,000 models in any month, you will be billed for additional usage on your next invoice. Additional usage is billed at the rates on our [pricing page](https://www.getdbt.com/pricing). Included models that are not consumed do not roll over to future months. You can estimate your bill with a simple formula: @@ -68,15 +121,22 @@ Included models that are not consumed do not roll over to future months. You can All included successful models built numbers above reflect our most current pricing and packaging. Based on your usage terms when you signed up for the Team Plan, the included model entitlements may be different from what’s reflected above. -:::note Legacy pricing plans - -Customers who purchased the dbt Cloud Team plan before August 11, 2023, remain on a legacy pricing plan as long as their account is in good standing. The legacy pricing plan is based on seats and includes unlimited models subject to reasonable use. dbt Labs may institute use limits if reasonable use is exceeded. Additional features, upgrades, or updates may be subject to separate charges. Any changes to your current plan pricing will be communicated in advance according to our Terms of Use. +### Enterprise plan billing + +As an Enterprise customer, you pay annually via invoice, monthly in arrears for additional usage (if applicable), and may benefit from negotiated usage rates. Please refer to your order form or contract for your specific pricing details, or [contact the account team](https://www.getdbt.com/contact-demo) with any questions. + +### Legacy plans + +Customers who purchased the dbt Cloud Team plan before August 11, 2023, remain on a legacy pricing plan as long as your account is in good standing. The legacy pricing plan is based on seats and includes unlimited models, subject to reasonable use. + +:::note Legacy Semantic Layer + +For customers using the legacy Semantic Layer with dbt_metrics package, this product will be deprecated in December 2023. Legacy users may choose to upgrade at any time to the revamped version, Semantic Layer powered by MetricFlow. The revamped version is available to most customers (see [prerequisites](/docs/use-dbt-semantic-layer/quickstart-sl#prerequisites)) for a limited time on a free trial basis, subject to reasonable use. ::: -### Enterprise plan billing +dbt Labs may institute use limits if reasonable use is exceeded. Additional features, upgrades, or updates may be subject to separate charges. Any changes to your current plan pricing will be communicated in advance according to our Terms of Use. -As an Enterprise customer, you pay annually via invoice, monthly in arrears for additional usage (if applicable), and may benefit from negotiated usage rates. 
Please refer to your order form or contract for your specific pricing details, or [contact the account team](https://www.getdbt.com/contact-demo) with any questions. ## Managing usage @@ -191,3 +251,10 @@ _Yes. Your dbt Cloud account will be upgraded without impacting your existing pr * How do I determine the right plan for me? _The best option is to consult with our sales team. They'll help you figure out what is right for your needs. We also offer a free two-week trial on the Team plan._ + +* What are the Semantic Layer trial terms? +_Team and Enterprise customers can sign up for a free trial of the dbt Semantic Layer, powered by MetricFlow, for use of up to 1,000 query units per month. The trial will be available at least through January 2024. dbt Labs may extend the trial period in its sole discretion. During the trial period, we may reach out to discuss pricing options or ask for feedback. At the end of the trial, free access may be removed and a purchase may be required to continue use. dbt Labs reserves the right to change limits in a free trial or institute pricing when required or at any time in its sole discretion._ + +* What is the reasonable use limitation for the dbt Semantic Layer powered by MetricFlow during the trial? +_Each account will be limited to 1,000 Queried Metrics per month during the trial period and may be changed at the sole discretion of dbt Labs._ + diff --git a/website/docs/docs/cloud/cloud-cli-installation.md b/website/docs/docs/cloud/cloud-cli-installation.md index 68a8ef365d6..20af8da314b 100644 --- a/website/docs/docs/cloud/cloud-cli-installation.md +++ b/website/docs/docs/cloud/cloud-cli-installation.md @@ -1,110 +1,276 @@ --- -title: Installing the dbt Cloud CLI (Alpha) +title: Install dbt Cloud CLI +sidebar_label: "Install dbt Cloud CLI" id: cloud-cli-installation description: "Instructions for installing and configuring dbt Cloud CLI" +pagination_next: "docs/cloud/configure-cloud-cli" --- -:::warning Alpha functionality +import CloudCLIFlag from '/snippets/_cloud-cli-flag.md'; -The following installation instructions are for the dbt Cloud CLI, currently in Alpha (actively in development and being tested). + -These instructions are not intended for general audiences at this time. -::: +dbt Cloud natively supports developing using a command line (CLI), empowering team members to contribute with enhanced flexibility and collaboration. The dbt Cloud CLI allows you to run dbt commands against your dbt Cloud development environment from your local command line. -## Installing dbt Cloud CLI +dbt commands are run against dbt Cloud's infrastructure and benefit from: -### Install and update with Brew on MacOS (recommended) +* Secure credential storage in the dbt Cloud platform. +* [Automatic deferral](/docs/cloud/about-cloud-develop-defer) of build artifacts to your Cloud project's production environment. +* Speedier, lower-cost builds. +* Support for dbt Mesh ([cross-project `ref`](/docs/collaborate/govern/project-dependencies)), +* Significant platform improvements, to be released over the coming months. -1. Install the dbt Cloud CLI: -```bash -brew tap dbt-labs/dbt-cli -brew install dbt-cloud-cli -``` +## Prerequisites +The dbt Cloud CLI is available in all [deployment regions](/docs/cloud/about-cloud/regions-ip-addresses) and for both multi-tenant and single-tenant accounts (Azure single-tenant not supported at this time). -2. Verify the installation by requesting your homebrew installation path (not your dbt core installs). 
If the `which dbt` command returns nothing, then you should modify your PATH in `~.zshrc` or create an alias. +- Ensure you are using dbt version 1.5 or higher. Refer to [dbt Cloud versions](/docs/dbt-versions/upgrade-core-in-cloud) to upgrade. +- Avoid using SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections. +- Avoid using [PrivateLink](/docs/cloud/secure/about-privatelink). -```bash -which dbt -dbt --help -``` +## Install dbt Cloud CLI -### Manually install (Windows and Linux) +You can install the dbt Cloud CLI on the command line by using one of these methods. -1. Download the latest release for your platform from [GitHub](https://github.com/dbt-labs/dbt-cli/releases). -2. Add the `dbt` executable to your path. -3. Move to a directory with a dbt project, and create a `dbt_cloud.yml` file containing your `project-id` from dbt Cloud. -4. Invoke `dbt --help` from your terminal to see a list of supported commands. +
+View a video tutorial for a step-by-step guide to installing -#### Updating your dbt Cloud installation (Windows + Linux) + -Follow the same process in [Installing dbt Cloud CLI](#manually-install-windows-only) and replace the existing `dbt` executable with the new one. You should not have to go through the security steps again. +
-## Setting up the CLI + + + -The following instructions are for setting up the dbt Cloud CLI. +Before you begin, make sure you have [Homebrew installed](http://brew.sh/) in your code editor or command line terminal. Refer to the [FAQs](#faqs) if your operating system runs into path conflicts. -1. Ensure that you have created a project in [dbt Cloud](https://cloud.getdbt.com/). +1. Run the following command to verify that you don't already have dbt Core installed: + + ```bash + which dbt + ``` + - This should return a `dbt not found`. If the dbt help text appears, use `pip uninstall dbt` to remove dbt Core from your machine.
+
+2. Install the dbt Cloud CLI with Homebrew:
-2. Ensure that your personal [development credentials](https://cloud.getdbt.com/settings/profile/credentials) are set on the project.
+   - First, remove the dbt-labs tap, the separate repository for packages, from Homebrew. This prevents Homebrew from installing packages from that repository:
+   ```bash
+   brew untap dbt-labs/dbt
+   ```
+   - Then run `brew tap` to add and install the dbt Cloud CLI as a package:
+   ```bash
+   brew tap dbt-labs/dbt-cli
+   ```
+   - Lastly, install the dbt Cloud CLI with Homebrew:
+   ```bash
+   brew install dbt
+   ```
-3. Navigate to [your profile](https://cloud.getdbt.com/settings/profile) and enable the **Beta** flag under **Experimental Features.**
+3. Verify your installation by running `dbt --help` in the command line. If you see the following output, your installation is correct:
+   ```bash
+   The dbt Cloud CLI - an ELT tool for running SQL transformations and data models in dbt Cloud...
+   ```
-4. Create an environment variable with your [dbt Cloud API key](https://cloud.getdbt.com/settings/profile#api-access):
+   If you don't see this output, check that you've deactivated pyenv or venv and don't have a global dbt version installed.
+
+   * Note that you no longer need to run the `dbt deps` command when your environment starts. This step was previously required during initialization. However, you should still run `dbt deps` if you make any changes to your `packages.yml` file.
-```bash
-vi ~/.zshrc
+4. After you've verified the installation, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project and use it to run [dbt commands](/reference/dbt-commands) similar to dbt Core. For example, execute `dbt compile` to compile a project using dbt Cloud and validate your models and tests.
+   * If you're using the dbt Cloud CLI, you can connect to your data platform directly in the dbt Cloud interface and don't need a [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml) file locally on your machine.
-# dbt Cloud CLI
-export DBT_CLOUD_API_KEY="1234" # Replace "1234" with your API key
-```
+
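Taken together, the Homebrew steps above boil down to this sequence (a recap of the commands already shown, not additional ones):

```bash
brew untap dbt-labs/dbt      # remove the legacy dbt-labs tap
brew tap dbt-labs/dbt-cli    # add the tap that hosts the dbt Cloud CLI
brew install dbt             # install the dbt Cloud CLI
dbt --help                   # verify the installation
```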
-5. Load the new environment variable. Note: You may need to reactivate your Python virtual environment after sourcing your shell's dot file. Alternatively, restart your shell instead of sourcing the shell's dot file + -```bash -source ~/.zshrc -``` +Refer to the [FAQs](#faqs) if your operating system runs into path conflicts. -6. Navigate to a dbt project +1. Download the latest Windows release for your platform from [GitHub](https://github.com/dbt-labs/dbt-cli/releases). -```bash -cd ~/dbt-projects/jaffle_shop -``` +2. Extract the `dbt.exe` executable into the same folder as your dbt project. -7. Create a `dbt_cloud.yml` in the root project directory. The file is required to have a `project-id` field with a valid [project ID](#glossary). Enter the following commands: +:::info -```bash -pwd # Input -/Users/user/dbt-projects/jaffle_shop # Output -``` +Advanced users can configure multiple projects to use the same dbt Cloud CLI by placing the executable in the Program Files folder and [adding it to their Windows PATH environment variable](https://medium.com/@kevinmarkvi/how-to-add-executables-to-your-path-in-windows-5ffa4ce61a53). -```bash -echo "project-id: ''" > dbt_cloud.yml # Input -``` +Note that if you are using VS Code, you must restart it to pick up modified environment variables. +::: -```bash -cat dbt_cloud.yml # Input -project-id: '123456' # Output -``` +3. Verify your installation by running `./dbt --help` in the command line. If you see the following output, your installation is correct: + ```bash + The dbt Cloud CLI - an ELT tool for running SQL transformations and data models in dbt Cloud... + ``` -You can find your project ID by selecting your project and clicking on **Develop** in the navigation bar. Your project ID is the number in the URL: https://cloud.getdbt.com/develop/26228/projects/PROJECT_ID. + If you don't see this output, check that you've deactivated pyenv or venv and don't have a global dbt version installed. + + * Note that you no longer need to run the `dbt deps` command when your environment starts. This step was previously required during initialization. However, you should still run `dbt deps` if you make any changes to your `packages.yml` file. -If `dbt_cloud.yml` already exists, edit the file, and verify the project ID field uses a valid project ID. +4. After installation, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project and use it to run [dbt commands](/reference/dbt-commands) similar to dbt Core. For example, execute `dbt compile`, to compile a project using dbt Cloud and confirm that it works. + * If you're using the dbt Cloud CLI, you can connect to your data platform directly in the dbt Cloud interface and don't need a [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml) file locally on your machine. -#### Upgrade the CLI with Brew + -```bash -brew update -brew upgrade dbt-cloud-cli -``` + -## Using dbt Cloud CLI +Refer to the [FAQs](#faqs) if your operating system runs into path conflicts. -**Coming soon** +1. Download the latest Linux release for your platform from [GitHub](https://github.com/dbt-labs/dbt-cli/releases). (Pick the file based on your CPU architecture) -## Glossary +2. Extract the `dbt-cloud-cli` binary to the same folder as your dbt project. -- **dbt cloud API key:** Your API key found by navigating to the **gear icon**, clicking **Profile Settings**, and scrolling down to **API**. -- **Project ID:** The ID of the dbt project you're working with. 
Can be retrieved from the dbt Cloud URL after a project has been selected, for example, `https://cloud.getdbt.com/deploy/{accountID}/projects/{projectID}` -- **Development credentials:** Your personal warehouse credentials for the project you’re working with. They can be set by selecting the project and entering them in dbt Cloud. Navigate to the **gear icon**, click **Profile Settings**, and click **Credentials** from the left-side menu. + ```bash + tar -xf dbt_0.29.9_linux_amd64.tar.gz + ./dbt --version + ``` + +:::info + +Advanced users can configure multiple projects to use the same Cloud CLI executable by adding it to their PATH environment variable in their shell profile. + +::: + +3. Verify your installation by running `./dbt --help` in the command line. If you see the following output, your installation is correct: + ```bash + The dbt Cloud CLI - an ELT tool for running SQL transformations and data models in dbt Cloud... + ``` + + If you don't see this output, check that you've deactivated pyenv or venv and don't have a global dbt version installed. + + * Note that you no longer need to run the `dbt deps` command when your environment starts. This step was previously required during initialization. However, you should still run `dbt deps` if you make any changes to your `packages.yml` file. + +4. After installation, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project and use it to run [dbt commands](/reference/dbt-commands) similar to dbt Core. For example, execute `dbt compile`, to compile a project using dbt Cloud and confirm that it works. + * If you're using the dbt Cloud CLI, you can connect to your data platform directly in the dbt Cloud interface and don't need a [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml) file locally on your machine. + + + + + + +:::info Use native packages or a virtual environment to prevent dbt Core conflicts + +To prevent overwriting dbt Core, avoid installing the dbt Cloud CLI with pip. Instead, consider using the native installation method and configuring your PATH or create a new virtual environment. + +If you've already installed the dbt Cloud CLI and need to switch back to dbt Core, uninstall the dbt Cloud CLI, and follow the dbt Core installation instructions. + +You can also have both dbt Cloud CLI and dbt Core installed simultaneously. To avoid conflicts, alias the dbt Cloud CLI as `dbt-cloud`. For more details, check the [FAQs](#faqs) if your operating system experiences path conflicts. +::: + + +Before installing the dbt Cloud CLI, make sure you have Python installed and your virtual environment venv or pyenv . If you already have a Python environment configured, you can skip to the [pip installation step](#install-dbt-cloud-cli-in-pip). + + +### Install a virtual environment + +We recommend using virtual environments (venv) to namespace `cloud-cli`. + +1. Create a new venv: + ```shell + python3 -m venv dbt-cloud + ``` + +2. Activate the virtual environment each time you create a shell window or session: + ```shell + source dbt-cloud/bin/activate # activate the environment for Mac and Linux OR + dbt-env\Scripts\activate # activate the environment for Windows + ``` + +3. (Mac and Linux only) Create an alias to activate your dbt environment with every new shell window or session. 
You can add the following to your shell's configuration file (for example, $HOME/.bashrc, $HOME/.zshrc) while replacing `` with the path to your virtual environment configuration:
+   ```shell
+   alias env_dbt='source /bin/activate'
+   ```
+
+### Install dbt Cloud CLI in pip
+
+1. (Optional) If you already have dbt Core installed, this installation will override that package. Note your dbt Core version in case you need to reinstall it later:
+
+   ```bash
+   dbt --version
+   ```
+
+2. Make sure you're in your virtual environment and run the following command to install the dbt Cloud CLI:
+
+   ```bash
+   pip3 install dbt
+   ```
+
+3. (Optional) To revert back to dbt Core, first uninstall both the dbt Cloud CLI and dbt Core.
+4. Reinstall dbt Core using the version you noted in Step 1.
+
+   ```bash
+   pip3 uninstall dbt-core dbt
+   pip3 install dbt-core==VERSION
+   ```
+
+5. After you've verified the installation, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project and use it to run [dbt commands](/reference/dbt-commands) similar to dbt Core. For example, execute `dbt compile` to compile a project using dbt Cloud and validate your models and tests.
+   * If you're using the dbt Cloud CLI, you can connect to your data platform directly in the dbt Cloud interface and don't need a [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml) file locally on your machine.
+
+
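For reference, the virtual environment and pip steps above condense to the following sequence on Mac or Linux (a recap of commands already listed; nothing new is introduced):

```bash
python3 -m venv dbt-cloud        # create the virtual environment
source dbt-cloud/bin/activate    # activate it for this shell session
pip3 install dbt                 # install the dbt Cloud CLI into the environment
dbt --version                    # confirm the dbt Cloud CLI is now the active dbt
```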
+
+
+## Update dbt Cloud CLI
+
+The following instructions explain how to update the dbt Cloud CLI to the latest version depending on your operating system.
+
+During the public preview period, we recommend updating before filing a bug report. This is because the API is subject to breaking changes.
+
+
+
+
+
+To update the dbt Cloud CLI, run `brew upgrade dbt`. (You can also use `brew install dbt`).
+
+
+
+
+
+To update, follow the same process explained in [Windows](/docs/cloud/cloud-cli-installation?install=windows#install-dbt-cloud-cli) and replace the existing `dbt.exe` executable with the new one.
+
+
+
+
+
+To update, follow the same process explained in [Linux](/docs/cloud/cloud-cli-installation?install=linux#install-dbt-cloud-cli) and replace the existing `dbt` executable with the new one.
+
+
+
+
+
+To update:
+- Make sure you're in your virtual environment.
+- Run `pip install --upgrade dbt`.
+
+
+
+
+
+## FAQs
+
+
What's the difference between the dbt Cloud CLI and dbt Core?
The dbt Cloud CLI and dbt Core, an open-source project, are both command line tools that enable you to run dbt commands. The key distinction is that the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its features.
+ +
+How do I run both the dbt Cloud CLI and dbt Core? +For compatibility, both the dbt Cloud CLI and dbt Core are invoked by running dbt. This can create path conflicts if your operating system selects one over the other based on your $PATH environment variable (settings). + +If you have dbt Core installed locally, either: + +1. Install using [pip](/docs/cloud/cloud-cli-installation?install=pip#install-dbt-cloud-cli). + +2. Install natively, but ensure that you deactivate your Python environment or uninstall it using `pip uninstall dbt` before proceeding. + +3. (Advanced users) Install natively, but modify the $PATH environment variable to correctly point to the dbt Cloud CLI binary to use both dbt Cloud CLI and dbt Core together. + +You can always uninstall the dbt Cloud CLI to return to using dbt Core. +
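As a hedged sketch of keeping both tools callable side by side (the aliasing approach mentioned in the note above), you could give each binary its own name in your shell profile. Both paths below are assumptions; point them at wherever your dbt Cloud CLI and dbt Core executables actually live:

```bash
# Hypothetical install locations; adjust to your machine.
alias dbt-cloud="$HOME/dbt-cloud-cli/dbt"        # dbt Cloud CLI binary
alias dbt="$HOME/.venvs/dbt-core/bin/dbt"        # dbt Core inside a virtual environment
```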
+ +
+Why am I receiving a Session occupied error?
+If you've run a dbt command and receive a Session occupied error, you can reattach to your existing session with dbt reattach and then press Control-C and choose to cancel the invocation.
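A minimal sketch of that recovery flow, using only the `dbt reattach` command named above:

```bash
dbt reattach   # reattach to the occupied session, then press Control-C and choose to cancel the invocation
```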
diff --git a/website/docs/docs/cloud/configure-cloud-cli.md b/website/docs/docs/cloud/configure-cloud-cli.md new file mode 100644 index 00000000000..c05749fd016 --- /dev/null +++ b/website/docs/docs/cloud/configure-cloud-cli.md @@ -0,0 +1,99 @@ +--- +title: Configure dbt Cloud CLI +id: configure-cloud-cli +description: "Instructions on how to configure the dbt Cloud CLI" +sidebar_label: "Configure dbt Cloud CLI" +pagination_next: null +--- + +import CloudCLIFlag from '/snippets/_cloud-cli-flag.md'; + + + + +## Prerequisites + +- You must set up a project in dbt Cloud. + - **Note** — If you're using the dbt Cloud CLI, you can connect to your data platform directly in the dbt Cloud interface and don't need a [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml) file. +- You must have your [personal development credentials](/docs/dbt-cloud-environments#set-developer-credentials) set for that project. The dbt Cloud CLI will use these credentials, stored securely in dbt Cloud, to communicate with your data platform. +- You must be on dbt version 1.5 or higher. Refer to [dbt Cloud versions](/docs/dbt-versions/upgrade-core-in-cloud) to upgrade. + +## Configure the dbt Cloud CLI + +Once you install the dbt Cloud CLI, you need to configure it to connect to a dbt Cloud project. + +1. Ensure you meet the prerequisites above. + +2. Download your credentials from dbt Cloud by clicking on the **Try the dbt Cloud CLI** banner on the dbt Cloud homepage. Alternatively, if you're in dbt Cloud, you can download the credentials from the links provided based on your region: + + - North America: https://cloud.getdbt.com/cloud-cli + - EMEA: https://emea.dbt.com/cloud-cli + - APAC: https://au.dbt.com/cloud-cli + - North American Cell 1: `https:/ACCOUNT_PREFIX.us1.dbt.com/cloud-cli` + - Single-tenant: `https://YOUR_ACCESS_URL/cloud-cli` + +3. Follow the banner instructions and download the config file to: + - Mac or Linux: `~/.dbt/dbt_cloud.yml` + - Windows: `C:\Users\yourusername\.dbt\dbt_cloud.yml` + + The config file looks like this: + + ```yaml + version: "1" + context: + active-project: "" + active-host: "" + defer-env-id: "" + projects: + - project-id: "" + account-host: "" + api-key: "" + + - project-id: "" + account-host: "" + api-key: "" + + ``` + +4. After downloading the config file, navigate to a dbt project in your terminal: + + ```bash + cd ~/dbt-projects/jaffle_shop + ``` + +5. In your `dbt_project.yml` file, ensure you have or include a `dbt-cloud` section with a `project-id` field. The `project-id` field contains the dbt Cloud project ID you want to use. + + ```yaml + # dbt_project.yml + name: + + version: + ... + + dbt-cloud: + project-id: PROJECT_ID + ``` + + - To find your project ID, select **Develop** in the dbt Cloud navigation menu. You can use the URL to find the project ID. For example, in `https://cloud.getdbt.com/develop/26228/projects/123456`, the project ID is `123456`. + +### Set environment variables + +To set environment variables in the dbt Cloud CLI for your dbt project: + +1. Select the gear icon on the upper right of the page. +2. Then select **Profile Settings**, then **Credentials**. +3. Click on your project and scroll to the **Environment Variables** section. +4. Click **Edit** on the lower right and then set the user-level environment variables. + +## Use the dbt Cloud CLI + +- The dbt Cloud CLI shares the same set of [dbt commands](/reference/dbt-commands) as dbt Core and processes the commands you invoke. 
+- It allows you to use automatic deferral of build artifacts to your Cloud project's production environment. +- It also supports [project dependencies](/docs/collaborate/govern/project-dependencies), which allows you to depend on another project using the metadata service in dbt Cloud. + - Project dependencies instantly connect to and reference (or `ref`) public models defined in other projects. This means you don't need to execute or analyze these upstream models yourself. Instead, you treat them as an API that returns a dataset. + +:::tip Use the --help flag +As a tip, most command-line tools have a `--help` flag to show available commands and arguments. Use the `--help` flag with dbt in two ways: +- `dbt --help`: Lists the commands available for dbt
+- `dbt run --help`: Lists the flags available for the `run` command +::: diff --git a/website/docs/docs/cloud/connect-data-platform/about-connections.md b/website/docs/docs/cloud/connect-data-platform/about-connections.md index 65bfac3a90d..1fe89c7273c 100644 --- a/website/docs/docs/cloud/connect-data-platform/about-connections.md +++ b/website/docs/docs/cloud/connect-data-platform/about-connections.md @@ -3,6 +3,8 @@ title: "About data platform connections" id: about-connections description: "Information about data platform connections" sidebar_label: "About data platform connections" +pagination_next: "docs/cloud/connect-data-platform/connect-starburst-trino" +pagination_prev: null --- dbt Cloud can connect with a variety of data platform providers including: - [Amazon Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) @@ -13,6 +15,10 @@ dbt Cloud can connect with a variety of data platform providers including: - [Snowflake](/docs/cloud/connect-data-platform/connect-snowflake) - [Starburst or Trino](/docs/cloud/connect-data-platform/connect-starburst-trino) +import MSCallout from '/snippets/_microsoft-adapters-soon.md'; + + + You can connect to your database in dbt Cloud by clicking the gear in the top right and selecting **Account Settings**. From the Account Settings page, click **+ New Project**. diff --git a/website/docs/docs/cloud/connect-data-platform/connect-apache-spark.md b/website/docs/docs/cloud/connect-data-platform/connect-apache-spark.md index 670b628547b..0186d821a54 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-apache-spark.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-apache-spark.md @@ -3,6 +3,7 @@ title: "Connect Apache Spark" id: connect-apache-spark description: "Setup instructions for connecting Apache Spark to dbt Cloud" sidebar_label: "Connect Apache Spark" +pagination_next: null --- diff --git a/website/docs/docs/cloud/connect-data-platform/connect-databricks.md b/website/docs/docs/cloud/connect-data-platform/connect-databricks.md index b66f5890c61..032246ad16a 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-databricks.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-databricks.md @@ -26,6 +26,8 @@ Unity Catalog allows Databricks users to centrally manage all data assets, simpl To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [Databricks-specific configuration](/reference/resource-configs/databricks-configs). +To grant users or roles database permissions (access rights and privileges), refer to the [example permissions](/reference/database-permissions/databricks-permissions) page. + To set up the Databricks connection, supply the following fields: diff --git a/website/docs/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb.md b/website/docs/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb.md index dae0ee1d178..ee5b09e83ef 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb.md @@ -64,3 +64,5 @@ The Bastion server should now be ready for dbt Cloud to use as a tunnel into the ## Configuration To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [Redshift-specific configuration](/reference/resource-configs/redshift-configs). 
+ +To grant users or roles database permissions (access rights and privileges), refer to the [Redshift permissions](/reference/database-permissions/redshift-permissions) page or [Postgres permissions](/reference/database-permissions/postgres-permissions) page. diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md index 62a58f6e1c5..5f1c4cae725 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md @@ -15,7 +15,7 @@ The following fields are required when creating a Snowflake connection | Warehouse | The virtual warehouse to use for running queries. | `transforming` | -**Note:** A crucial part of working with dbt atop Snowflake is ensuring that users (in development environments) and/or service accounts (in deployment to production environments) have the correct permissions to take actions on Snowflake! Here is documentation of some [example permissions to configure Snowflake access](/reference/snowflake-permissions). +**Note:** A crucial part of working with dbt atop Snowflake is ensuring that users (in development environments) and/or service accounts (in deployment to production environments) have the correct permissions to take actions on Snowflake! Here is documentation of some [example permissions to configure Snowflake access](/reference/database-permissions/snowflake-permissions). ### Username / Password diff --git a/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-ide.md b/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-ide.md new file mode 100644 index 00000000000..3c41432bc62 --- /dev/null +++ b/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-ide.md @@ -0,0 +1,37 @@ +--- +title: "dbt Cloud IDE" +description: "Learn how to configure Git in dbt Cloud" +pagination_next: "docs/cloud/dbt-cloud-ide/develop-in-the-cloud" +pagination_prev: null +--- + +
+ + + + + +
+
+
+ + + + +
\ No newline at end of file diff --git a/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-tips.md b/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-tips.md index cfae00b960e..39db7832d79 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-tips.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-tips.md @@ -3,6 +3,7 @@ title: "Tips and tricks" id: dbt-cloud-tips description: "Check out any dbt Cloud and IDE-related tips." sidebar_label: "Tips and tricks" +pagination_next: null --- # dbt Cloud tips @@ -16,7 +17,7 @@ There are default keyboard shortcuts that can help make development more product - Press Fn-F1 to view a full list of the editor shortcuts - Command-O on macOS or Control-O on Windows to select a file to open - Command-P/Command-Shift-P on macOS or Control-P/Control-Shift-P on Windows to see the command palette -- Hold Option-click-on-area on macOS or Hold-Alt-click-on-area on Windows to select multiple lines and perform a multi-edit. You can also press Command-E to perform this operation on the command line. +- Hold Option-click-on-area or press Shift-Option-Command on macOS or Hold-Alt-click-on-area on Windows to select multiple lines and perform a multi-edit. You can also press Command-E to perform this operation on the command line. - Command-Enter on macOS or Control-Enter on Windows to Preview your code - Command-Shift-Enter on macOS or Control-Shift-Enter on Windows to Compile - Highlight a portion of code and use the above shortcuts to Preview or Compile code diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md index c55e67cf93e..9fc382f0217 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md @@ -1,22 +1,31 @@ --- -title: "Develop in the IDE" +title: "About the dbt Cloud IDE" id: develop-in-the-cloud description: "Develop, test, run, and build in the Cloud IDE. With the Cloud IDE, you can compile dbt code into SQL and run it against your database directly" -sidebar_label: Develop in the IDE +sidebar_label: About the IDE tags: [IDE] +pagination_next: "docs/cloud/dbt-cloud-ide/ide-user-interface" +pagination_prev: null --- -The dbt Cloud integrated development environment (IDE) is a single interface for building, testing, running, and version-controlling dbt projects from your browser. With the Cloud IDE, you can compile dbt code into SQL and run it against your database directly. +The dbt Cloud integrated development environment (IDE) is a single web-based interface for building, testing, running, and version-controlling dbt projects. It compiles dbt code into SQL and executes it directly on your database. -## Prerequisites +The dbt Cloud IDE offers several [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) for faster and more efficient data platform development and governance: -To develop in the Cloud IDE, make sure you have the following: +- Syntax highlighting for SQL: Makes it easy to distinguish different parts of your code, reducing syntax errors and enhancing readability. +- Auto-completion: Suggests table names, arguments, and column names as you type, saving time and reducing typos. +- Code [formatting and linting](/docs/cloud/dbt-cloud-ide/lint-format): Help standardize and fix your SQL code effortlessly. +- Navigation tools: Easily move around your code, jump to specific lines, find and replace text, and navigate between project files. 
+- Version control: Manage code versions with a few clicks. -- A [dbt Cloud account](https://cloud.getdbt.com/) and [Developer seat license](/docs/cloud/manage-access/seats-and-users) -- A git repository set up and git provider must have `write` access enabled. See [Connecting your GitHub Account](/docs/cloud/git/connect-github) or [Importing a project by git URL](/docs/cloud/git/import-a-project-by-git-url) for detailed setup instructions -- A dbt project connected to a [data platform](/docs/cloud/connect-data-platform/about-connections) -- A [development environment and development credentials](#access-the-cloud-ide) set up -- The environment must be on dbt version 1.0 or higher +These [features](#dbt-cloud-ide-features) create a powerful editing environment for efficient SQL coding, suitable for both experienced and beginner developers. + + + + + + + :::tip Disable ad blockers @@ -24,21 +33,16 @@ To improve your experience using dbt Cloud, we suggest that you turn off ad bloc ::: -## Develop in the Cloud IDE - -The Cloud IDE is a powerful tool that can help streamline and govern your data platform development process. It offers a range of [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) that can help make your data platform development process faster and more efficient. Some of the editing features include: - -- The IDE has syntax highlighting for SQL. This makes it easy to visually distinguish between different parts of your code. This helps prevent syntax errors and improve readability. -- Use the IDE built-in auto-completion, which suggests table names, arguments, and column names as you type. This saves time and reduces the likelihood of typos or errors in your code. -- The code [formatting and linting](/docs/cloud/dbt-cloud-ide/lint-format) tools allow you to standardize and fix your SQL code with ease. -- The IDE has a range of navigation tools, making it easy to move around your code with ease. You can quickly jump to specific lines of code, find and replace text, and navigate between different files in your project. -- Use the version control menu and features to version-control your code with just a few clicks. +## Prerequisites -All of these [features](#cloud-ide-features) work together to create a powerful editing environment that can help you write and maintain high-quality SQL code in less time. Whether you're a seasoned developer or just starting out, the Cloud IDE has everything you need to be productive, collaborative, and efficient. +- A [dbt Cloud account](https://cloud.getdbt.com/) and [Developer seat license](/docs/cloud/manage-access/seats-and-users) +- A git repository set up and git provider must have `write` access enabled. See [Connecting your GitHub Account](/docs/cloud/git/connect-github) or [Importing a project by git URL](/docs/cloud/git/import-a-project-by-git-url) for detailed setup instructions +- A dbt project connected to a [data platform](/docs/cloud/connect-data-platform/about-connections) +- A [development environment and development credentials](#access-the-cloud-ide) set up +- The environment must be on dbt version 1.0 or higher - -## Cloud IDE features +## dbt Cloud IDE features The dbt Cloud IDE comes with [tips](/docs/cloud/dbt-cloud-ide/dbt-cloud-tips) and [features](/docs/cloud/dbt-cloud-ide/ide-user-interface) that make it easier for you to develop, build, compile, run, and test data models. @@ -89,9 +93,9 @@ The Cloud IDE needs explicit action to save your changes. 
There are three ways y ## Access the Cloud IDE -:::info📌 +:::tip Disable ad blockers -New to dbt? Check out our [quickstart guides](/quickstarts) to build your first dbt project in the Cloud IDE! +To improve your experience using dbt Cloud, we suggest that you turn off ad blockers. This is because some project file names, such as `google_adwords.sql`, might resemble ad traffic and trigger ad blockers. ::: @@ -155,13 +159,15 @@ The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-
- What is the difference between developing on the Cloud IDE and on the CLI? + What is the difference between developing on the dbt Cloud IDE, the dbt Cloud CLI, and dbt Core?
-
There are two main ways to develop with dbt: using the web-based IDE in dbt Cloud or using the command-line interface (CLI) in dbt Core:

- - dbt Cloud IDE dbt Cloud is a web-based application that allows you to develop dbt projects with the IDE, includes a purpose-built scheduler, and provides an easier way to share your dbt documentation with your team. The IDE is a faster and more reliable way to deploy your dbt models and provides a real-time editing and execution environment for your dbt project.

- - dbt Core CLI The command line interface (CLI) uses dbt Core, an open-source software that’s freely available. You can build your dbt project in a code editor, like Jetbrains or VSCode, and run dbt commands from the command line. +
You can develop dbt using the web-based IDE in dbt Cloud or from the command line using either the dbt Cloud CLI or open-source dbt Core, all of which enable you to execute dbt commands. The key distinction between the dbt Cloud CLI and dbt Core is that the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all of its features.

+ + dbt Cloud IDE: dbt Cloud is a web-based application that allows you to develop dbt projects with the IDE, includes a purpose-built scheduler, and provides an easier way to share your dbt documentation with your team. The IDE is a faster and more reliable way to deploy your dbt models and provides a real-time editing and execution environment for your dbt project.

+ + dbt Cloud CLI: The dbt Cloud CLI allows you to run dbt commands against your dbt Cloud development environment from your local command line or code editor. It supports cross-project `ref`, faster and lower-cost builds, automatic deferral of build artifacts, and more.

+ + dbt Core: dbt Core is open-source software that's freely available. You can build your dbt project in a code editor and run dbt commands from the command line.
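To make the comparison concrete, the following sketch shows the kind of commands involved. It's illustrative only and assumes open-source dbt Core with the Postgres adapter, a configured dbt project and profile, and a placeholder model name (`STG_ORDERS`); the same dbt commands also run in the dbt Cloud IDE command bar and the dbt Cloud CLI — only where they execute differs.

```shell
# Illustrative sketch — assumes open-source dbt Core plus the Postgres adapter,
# with a dbt project and profile already configured.
# The dbt Cloud IDE command bar and the dbt Cloud CLI accept the same dbt commands.
python -m pip install dbt-core dbt-postgres

# Build and test a single model (STG_ORDERS is a placeholder model name)
dbt run --select STG_ORDERS
dbt test --select STG_ORDERS

# Generate project documentation
dbt docs generate
```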
diff --git a/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md b/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md index de643413a8a..05910b23e7f 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md @@ -36,11 +36,13 @@ The IDE streamlines your workflow, and features a popular user interface layout * Added (A) — The IDE detects added files * Deleted (D) — The IDE detects deleted files. - + 5. **Command bar —** The Command bar, located in the lower left of the IDE, is used to invoke [dbt commands](/reference/dbt-commands). When a command is invoked, the associated logs are shown in the Invocation History Drawer. -6. **IDE Status button —** The IDE Status button, located on the lower right of the IDE, displays the current IDE status. If there is an error in the status or in the dbt code that stops the project from parsing, the button will turn red and display "Error". If there aren't any errors, the button will display a green "Ready" status. To access the [IDE Status modal](#modals-and-menus), simply click on this button. +6. **Defer to production —** The **Defer to production** toggle allows developers to only build and run and test models they've edited without having to first run and build all the models that come before them (upstream parents). Refer to [Using defer in dbt Cloud](/docs/cloud/about-cloud-develop-defer#defer-in-the-dbt-cloud-ide) for more info. + +7. **Status button —** The IDE Status button, located on the lower right of the IDE, displays the current IDE status. If there is an error in the status or in the dbt code that stops the project from parsing, the button will turn red and display "Error". If there aren't any errors, the button will display a green "Ready" status. To access the [IDE Status modal](#modals-and-menus), simply click on this button. ## Editing features diff --git a/website/docs/docs/cloud/dbt-cloud-ide/lint-format.md b/website/docs/docs/cloud/dbt-cloud-ide/lint-format.md index 8ffd83ef00e..6a86f1aa14b 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/lint-format.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/lint-format.md @@ -45,7 +45,11 @@ With the dbt Cloud IDE, you can seamlessly use [SQLFluff](https://sqlfluff.com/) - Works with Jinja and SQL, - Comes with built-in [linting rules](https://docs.sqlfluff.com/en/stable/rules.html). You can also [customize](#customize-linting) your own linting rules. - Empowers you to [enable linting](#enable-linting) with options like **Lint** (displays linting errors and recommends actions) or **Fix** (auto-fixes errors in the IDE). -- Displays a **Code Quality** tab to view code errors, and provides code quality visibility and management. +- Displays a **Code Quality** tab to view code errors, and provides code quality visibility and management. + +:::info Ephemeral models not supported +Linting doesn't support ephemeral models in dbt v1.5 and lower. Refer to the [FAQs](#faqs) for more info. +::: ### Enable linting @@ -223,6 +227,12 @@ Currently, running SQLFluff commands from the terminal isn't supported. Make sure you're on a development branch. Formatting or Linting isn't available on "main" or "read-only" branches. +
+
Why is there inconsistent SQLFluff behavior when running outside the dbt Cloud IDE (such as in a GitHub Action)?
+
— Double-check that your SQLFluff version matches the one in the dbt Cloud IDE (found in the **Code Quality** tab after a lint operation).

+
— If your lint operation passes despite clear rule violations, confirm that your lint selection doesn't include ephemeral models. Linting doesn't support ephemeral models in dbt v1.5 and lower.
+
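As a hedged illustration of the first point, pinning SQLFluff outside dbt Cloud to the version shown in the **Code Quality** tab avoids most of these mismatches. The version number and model path below are placeholders:

```shell
# Sketch only — pin SQLFluff (and the dbt templater plugin) to the version reported
# in the dbt Cloud IDE's Code Quality tab.
# SQLFLUFF_VERSION and models/staging are placeholders; adjust them for your project.
python -m pip install "sqlfluff==SQLFLUFF_VERSION" sqlfluff-templater-dbt

# Lint the same paths you lint in the IDE
sqlfluff lint models/staging
```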
+ ## Related docs - [User interface](/docs/cloud/dbt-cloud-ide/ide-user-interface) diff --git a/website/docs/docs/cloud/git/authenticate-azure.md b/website/docs/docs/cloud/git/authenticate-azure.md index 03020ccca73..42028bf993b 100644 --- a/website/docs/docs/cloud/git/authenticate-azure.md +++ b/website/docs/docs/cloud/git/authenticate-azure.md @@ -3,10 +3,11 @@ title: "Authenticate with Azure DevOps" id: "authenticate-azure" description: "dbt Cloud developers need to authenticate with Azure DevOps." sidebar_label: "Authenticate with Azure DevOps" +pagination_next: null --- -If you use the dbt Cloud IDE to collaborate on your team's Azure DevOps dbt repo, you need to [link your dbt Cloud profile to Azure DevOps](#link-your-dbt-cloud-profile-to-azure-devops), which provides an extra layer of authentication. +If you use the dbt Cloud IDE or dbt Cloud CLI to collaborate on your team's Azure DevOps dbt repo, you need to [link your dbt Cloud profile to Azure DevOps](#link-your-dbt-cloud-profile-to-azure-devops), which provides an extra layer of authentication. ## Link your dbt Cloud profile to Azure DevOps diff --git a/website/docs/docs/cloud/git/connect-azure-devops.md b/website/docs/docs/cloud/git/connect-azure-devops.md index bc5bb81dd24..c138e042abc 100644 --- a/website/docs/docs/cloud/git/connect-azure-devops.md +++ b/website/docs/docs/cloud/git/connect-azure-devops.md @@ -1,6 +1,7 @@ --- title: "Connect to Azure DevOps" id: "connect-azure-devops" +pagination_next: "docs/cloud/git/setup-azure" --- @@ -13,7 +14,7 @@ Connect your Azure DevOps cloud account in dbt Cloud to unlock new product exper - Import new Azure DevOps repos with a couple clicks during dbt Cloud project setup. - Clone repos using HTTPS rather than SSH - Enforce user authorization with OAuth 2.0. -- Carry Azure DevOps user repository permissions (read / write access) through to dbt Cloud IDE's git actions. +- Carry Azure DevOps user repository permissions (read / write access) through to dbt Cloud IDE or dbt Cloud CLI's git actions. - Trigger Continuous integration (CI) builds when pull requests are opened in Azure DevOps. diff --git a/website/docs/docs/cloud/git/connect-github.md b/website/docs/docs/cloud/git/connect-github.md index 771e4286ef6..ff0f2fff18f 100644 --- a/website/docs/docs/cloud/git/connect-github.md +++ b/website/docs/docs/cloud/git/connect-github.md @@ -74,7 +74,7 @@ To connect a personal GitHub account: 4. Once you approve authorization, you will be redirected to dbt Cloud, and you should now see your connected account. -The next time you log into dbt Cloud, you will be able to do so via OAuth through GitHub, and if you're on the Enterprise plan, you're ready to use the dbt Cloud IDE. +The next time you log into dbt Cloud, you will be able to do so via OAuth through GitHub, and if you're on the Enterprise plan, you're ready to use the dbt Cloud IDE or dbt Cloud CLI. ## FAQs diff --git a/website/docs/docs/cloud/git/connect-gitlab.md b/website/docs/docs/cloud/git/connect-gitlab.md index 53fde5f4878..e55552e2d86 100644 --- a/website/docs/docs/cloud/git/connect-gitlab.md +++ b/website/docs/docs/cloud/git/connect-gitlab.md @@ -8,7 +8,7 @@ id: "connect-gitlab" Connecting your GitLab account to dbt Cloud provides convenience and another layer of security to dbt Cloud: - Import new GitLab repos with a couple clicks during dbt Cloud project setup. - Clone repos using HTTPS rather than SSH. -- Carry GitLab user permissions through to dbt Cloud IDE's git actions. 
+- Carry GitLab user permissions through to dbt Cloud or dbt Cloud CLI's git actions. - Trigger [Continuous integration](/docs/deploy/continuous-integration) builds when merge requests are opened in GitLab. The steps to integrate GitLab in dbt Cloud depend on your plan. If you are on: @@ -35,7 +35,7 @@ Once you've accepted, you should be redirected back to dbt Cloud, and you'll see dbt Cloud enterprise customers have the added benefit of bringing their own GitLab OAuth application to dbt Cloud. This tier benefits from extra security, as dbt Cloud will: - Enforce user authorization with OAuth. -- Carry GitLab's user repository permissions (read / write access) through to dbt Cloud IDE's git actions. +- Carry GitLab's user repository permissions (read / write access) through to dbt Cloud or dbt Cloud CLI's git actions. In order to connect GitLab in dbt Cloud, a GitLab account admin must: 1. [Set up a GitLab OAuth application](#setting-up-a-gitlab-oauth-application). @@ -97,7 +97,7 @@ You will then be redirected to GitLab and prompted to sign into your account. Gi Once you've accepted, you should be redirected back to dbt Cloud, and your integration is ready for developers on your team to [personally authenticate with](#personally-authenticating-with-gitlab). ### Personally authenticating with GitLab -dbt Cloud developers on the Enterprise plan must each connect their GitLab profiles to dbt Cloud, as every developer's read / write access for the dbt repo is checked in the dbt Cloud IDE. +dbt Cloud developers on the Enterprise plan must each connect their GitLab profiles to dbt Cloud, as every developer's read / write access for the dbt repo is checked in the dbt Cloud IDE or dbt Cloud CLI. To connect a personal GitLab account, dbt Cloud developers should navigate to Your Profile settings by clicking the gear icon in the top right, then select **Linked Accounts** in the left menu. @@ -105,7 +105,7 @@ If your GitLab account is not connected, you’ll see "No connected account". Se -Once you approve authorization, you will be redirected to dbt Cloud, and you should see your connected account. You're now ready to start developing in the dbt Cloud IDE. +Once you approve authorization, you will be redirected to dbt Cloud, and you should see your connected account. You're now ready to start developing in the dbt Cloud IDE or dbt Cloud CLI. ## Troubleshooting diff --git a/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md b/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md new file mode 100644 index 00000000000..fb8c0186236 --- /dev/null +++ b/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md @@ -0,0 +1,37 @@ +--- +title: "Git configuration in dbt Cloud" +description: "Learn about the Git providers supported in dbt Cloud" +pagination_next: "docs/cloud/git/import-a-project-by-git-url" +pagination_prev: null +--- + +
+ + + + + +
+
+
+ + + + +
\ No newline at end of file diff --git a/website/docs/docs/cloud/git/import-a-project-by-git-url.md b/website/docs/docs/cloud/git/import-a-project-by-git-url.md index ba53baa33ea..83846bb1f0b 100644 --- a/website/docs/docs/cloud/git/import-a-project-by-git-url.md +++ b/website/docs/docs/cloud/git/import-a-project-by-git-url.md @@ -1,6 +1,8 @@ --- title: "Import a project by git URL" id: "import-a-project-by-git-url" +pagination_next: "docs/cloud/git/connect-github" +pagination_prev: null --- In dbt Cloud, you can import a git repository from any valid git URL that points to a dbt project. There are some important considerations to keep in mind when doing this. diff --git a/website/docs/docs/cloud/git/setup-azure.md b/website/docs/docs/cloud/git/setup-azure.md index 9eca77d7014..843371be6ea 100644 --- a/website/docs/docs/cloud/git/setup-azure.md +++ b/website/docs/docs/cloud/git/setup-azure.md @@ -93,7 +93,7 @@ Once you connect your Azure AD app and Azure DevOps, you need to provide dbt Clo - **Directory(tenant) ID:** Found in the Azure AD App. -Your Azure AD app should now be added to your dbt Cloud Account. People on your team who want to develop in dbt Cloud's IDE can now personally [authorize Azure DevOps from their profiles](/docs/cloud/git/authenticate-azure). +Your Azure AD app should now be added to your dbt Cloud Account. People on your team who want to develop in the dbt Cloud IDE or dbt Cloud CLI can now personally [authorize Azure DevOps from their profiles](/docs/cloud/git/authenticate-azure). ## Connect a service user diff --git a/website/docs/docs/cloud/manage-access/about-access.md b/website/docs/docs/cloud/manage-access/about-access.md index f9f97bc555d..d394c79baa3 100644 --- a/website/docs/docs/cloud/manage-access/about-access.md +++ b/website/docs/docs/cloud/manage-access/about-access.md @@ -2,6 +2,8 @@ title: "About user access in dbt Cloud" description: "Learn how dbt Cloud administrators can use dbt Cloud's permissioning model to control user-level access in a dbt Cloud account." id: "about-user-access" +pagination_next: "docs/cloud/manage-access/seats-and-users" +pagination_prev: null --- :::info "User access" is not "Model access" diff --git a/website/docs/docs/cloud/manage-access/audit-log.md b/website/docs/docs/cloud/manage-access/audit-log.md index 98bf660b259..b90bceef570 100644 --- a/website/docs/docs/cloud/manage-access/audit-log.md +++ b/website/docs/docs/cloud/manage-access/audit-log.md @@ -3,6 +3,8 @@ title: "The audit log for dbt Cloud Enterprise" id: audit-log description: "You can troubleshoot possible issues and provide security audits by reviewing event activity in your organization." sidebar_label: "Audit log" +pagination_next: null +pagination_prev: "docs/cloud/manage-access/about-user-access" --- To review actions performed by people in your organization, dbt provides logs of audited user and system events in real time. The audit log appears as events happen and includes details such as who performed the action, what the action was, and when it was performed. You can use these details to troubleshoot access issues, perform security audits, or analyze specific events. 
diff --git a/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md b/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md index 04dfbe093c3..24c64a5abed 100644 --- a/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md +++ b/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md @@ -3,6 +3,8 @@ title: "Users and licenses" description: "Learn how dbt Cloud administrators can use licenses and seats to control access in a dbt Cloud account." id: "seats-and-users" sidebar: "Users and licenses" +pagination_next: "docs/cloud/manage-access/self-service-permissions" +pagination_prev: null --- In dbt Cloud, _licenses_ are used to allocate users to your account. There are three different types of licenses in dbt Cloud: @@ -16,6 +18,7 @@ The user's assigned license determines the specific capabilities they can access | Functionality | Developer User | Read-Only Users | IT Users* | | ------------- | -------------- | --------------- | -------- | | Use the dbt Cloud IDE | ✅ | ❌ | ❌ | +| Use the dbt Cloud CLI | ✅ | ❌ | ❌ | | Use Jobs | ✅ | ❌ | ❌ | | Manage Account | ✅ | ❌ | ✅ | | API Access | ✅ | ❌ | ❌ | diff --git a/website/docs/docs/cloud/manage-access/enterprise-permissions.md b/website/docs/docs/cloud/manage-access/enterprise-permissions.md index 5bf3623b105..dcacda20deb 100644 --- a/website/docs/docs/cloud/manage-access/enterprise-permissions.md +++ b/website/docs/docs/cloud/manage-access/enterprise-permissions.md @@ -3,6 +3,7 @@ title: "Enterprise permissions" id: "enterprise-permissions" description: "Permission sets for Enterprise plans." hide_table_of_contents: true #For the sake of the tables on this page +pagination_next: null --- import Permissions from '/snippets/_enterprise-permissions-table.md'; @@ -21,10 +22,6 @@ The following roles and permission sets are available for assignment in dbt Clou -## Diagram of the permission sets - - - ## How to set up RBAC Groups in dbt Cloud Role-Based Access Control (RBAC) is helpful for automatically assigning permissions to dbt admins based on their SSO provider group associations. diff --git a/website/docs/docs/cloud/manage-access/self-service-permissions.md b/website/docs/docs/cloud/manage-access/self-service-permissions.md index 21cc765b76d..d3c9cf8f5ea 100644 --- a/website/docs/docs/cloud/manage-access/self-service-permissions.md +++ b/website/docs/docs/cloud/manage-access/self-service-permissions.md @@ -12,7 +12,8 @@ The permissions afforded to each role are described below: | ------ | ------ | ----- | | View and edit resources | ✅ | ✅ | | Trigger runs | ✅ | ✅ | -| Access the IDE | ✅ | ✅ | +| Access the dbt Cloud IDE | ✅ | ✅ | +| Access the dbt Cloud CLI | ✅ | ✅ | | Invite Members to the account | ✅ | ✅ | | Manage billing | ❌ | ✅ | | Manage team permissions | ❌ | ✅ | diff --git a/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md b/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md index 516a340c951..87018b14d56 100644 --- a/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md +++ b/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md @@ -1,7 +1,8 @@ --- title: "Set up BigQuery OAuth" -description: "Learn how dbt Cloud administrators can use licenses and seats to control access in a dbt Cloud account." 
+description: "Learn how dbt Cloud administrators can use BigQuery OAuth to control access in a dbt Cloud account" id: "set-up-bigquery-oauth" +pagination_next: null --- :::info Enterprise Feature @@ -73,3 +74,7 @@ You will then be redirected to BigQuery and asked to approve the drive, cloud pl Select **Allow**. This redirects you back to dbt Cloud. You should now be an authenticated BigQuery user, ready to use the dbt Cloud IDE. + +## FAQs + + diff --git a/website/docs/docs/cloud/manage-access/set-up-databricks-oauth.md b/website/docs/docs/cloud/manage-access/set-up-databricks-oauth.md new file mode 100644 index 00000000000..679133b7844 --- /dev/null +++ b/website/docs/docs/cloud/manage-access/set-up-databricks-oauth.md @@ -0,0 +1,77 @@ +--- +title: "Set up Databricks OAuth" +description: "Learn how dbt Cloud administrators can use Databricks OAuth to control access in a dbt Cloud account." +id: "set-up-databricks-oauth" +--- + +:::info Enterprise Feature + +This guide describes a feature of the dbt Cloud Enterprise plan. If you’re interested in learning more about an Enterprise plan, contact us at sales@getdbt.com. + +::: + +dbt Cloud supports developer OAuth ([OAuth for partner solutions](https://docs.databricks.com/en/integrations/manage-oauth.html)) with Databricks, providing an additional layer of security for dbt enterprise users. When you enable Databricks OAuth for a dbt Cloud project, all dbt Cloud developers must authenticate with Databricks in order to use the dbt Cloud IDE. The project's deployment environments will still leverage the Databricks authentication method set at the environment level. + +:::tip Beta Feature + +Databricks OAuth support in dbt Cloud is a [beta feature](/docs/dbt-versions/product-lifecycles#dbt-cloud) and subject to change without notification. More updates to this feature coming soon. + +Current limitations: +- Databrick's OAuth applications are in public preview +- The current experience requires the IDE to be restarted every hour (access tokens expire after 1 hour - [workaround](https://docs.databricks.com/en/integrations/manage-oauth.html#override-the-default-token-lifetime-policy-for-dbt-core-power-bi-or-tableau-desktop)) + +::: + +### Configure Databricks OAuth (Databricks admin) + +To get started, you will need to [add dbt as an OAuth application](https://docs.databricks.com/en/integrations/configure-oauth-dbt.html) with Databricks, in 2 steps: + +1. From your terminal, [authenticate to the Databricks Account API](https://docs.databricks.com/en/integrations/configure-oauth-dbt.html#authenticate-to-the-account-api) with the Databricks CLI. You authenticate using: + - OAuth for users ([prerequisites](https://docs.databricks.com/en/dev-tools/auth.html#oauth-u2m-auth)) + - Oauth for service principals ([prerequisites](https://docs.databricks.com/en/dev-tools/auth.html#oauth-m2m-auth)) + - Username and password (must be account admin) +2. 
In the same terminal, **add dbt Cloud as an OAuth application** using `curl` and the [OAuth Custom App Integration API](https://docs.databricks.com/api/account/customappintegration/create) + +For the second step, you can use this example `curl` to authenticate with your username and password, replacing values as defined in the following table: + +```shell +curl -u USERNAME:PASSWORD https://accounts.cloud.databricks.com/api/2.0/accounts/ACCOUNT_ID/oauth2/custom-app-integrations -d '{"redirect_urls": ["https://YOUR_ACCESS_URL", "https://YOUR_ACCESS_URL/complete/databricks"], "confidential": true, "name": "NAME", "scopes": ["sql", "offline_access"]}' +``` + +These parameters and descriptions will help you authenticate with your username and password: + +| Parameter | Description | +| ------ | ----- | +| **USERNAME** | Your Databricks username (account admin level) | +| **PASSWORD** | Your Databricks password (account admin level) | +| **ACCOUNT_ID** | Your Databricks [account ID](https://docs.databricks.com/en/administration-guide/account-settings/index.html#locate-your-account-id) | +| **YOUR_ACCESS_URL** | The [appropriate Access URL](/docs/cloud/about-cloud/regions-ip-addresses) for your dbt Cloud account region and plan | +| **NAME** | The integration name (i.e 'databricks-dbt-cloud') + +After running the `curl`, you'll get an API response that includes the `client_id` and `client_secret` required in the following section. At this time, this is the only way to retrieve the secret. If you lose the secret, then the integration needs to be [deleted](https://docs.databricks.com/api/account/customappintegration/delete) and re-created. + + +### Configure the Connection in dbt Cloud (dbt Cloud project admin) + +Now that you have an OAuth app set up in Databricks, you'll need to add the client ID and secret to dbt Cloud. To do so: + - go to Settings by clicking the gear in the top right. + - on the left, select **Projects** under **Account Settings** + - choose your project from the list + - select **Connection** to edit the connection details + - add the `OAuth Client ID` and `OAuth Client Secret` from the Databricks OAuth app under the **Optional Settings** section + + + +### Authenticating to Databricks (dbt Cloud IDE developer) + +Once the Databricks connection via OAuth is set up for a dbt Cloud project, each dbt Cloud user will need to authenticate with Databricks in order to use the IDE. To do so: + +- Click the gear icon at the top right and select **Profile settings**. +- Select **Credentials**. +- Choose your project from the list +- Select `OAuth` as the authentication method, and click **Save** +- Finalize by clicking the **Connect Databricks Account** button + + + +You will then be redirected to Databricks and asked to approve the connection. This redirects you back to dbt Cloud. You should now be an authenticated Databricks user, ready to use the dbt Cloud IDE. diff --git a/website/docs/docs/cloud/manage-access/set-up-sso-azure-active-directory.md b/website/docs/docs/cloud/manage-access/set-up-sso-azure-active-directory.md index 349c3d8ecd7..28d20b526db 100644 --- a/website/docs/docs/cloud/manage-access/set-up-sso-azure-active-directory.md +++ b/website/docs/docs/cloud/manage-access/set-up-sso-azure-active-directory.md @@ -144,7 +144,7 @@ To complete setup, follow the steps below in the dbt Cloud application. 
| ----- | ----- | | **Log in with** | Azure AD Single Tenant | | **Client ID** | Paste the **Application (client) ID** recorded in the steps above | -| **Client Secret** | Paste the **Client Secret** (remember to use the Secret Value instead of the Secret ID) recorded in the steps above | +| **Client Secret** | Paste the **Client Secret** (remember to use the Secret Value instead of the Secret ID) recorded in the steps above;
**Note:** When the client secret expires, an Azure AD admin will have to generate a new one to be pasted into dbt Cloud for uninterrupted application access. | | **Tenant ID** | Paste the **Directory (tenant ID)** recorded in the steps above | | **Domain** | Enter the domain name for your Azure directory (such as `fishtownanalytics.com`). Only use the primary domain; this won't block access for other domains. | | **Slug** | Enter your desired login slug. Users will be able to log into dbt Cloud by navigating to `https://YOUR_ACCESS_URL/enterprise-login/LOGIN-SLUG`, replacing `YOUR_ACCESS_URL` with the [appropriate Access URL](/docs/cloud/manage-access/sso-overview#auth0-multi-tenant-uris) for your region and plan. Login slugs must be unique across all dbt Cloud accounts, so pick a slug that uniquely identifies your company. | diff --git a/website/docs/docs/cloud/manage-access/set-up-sso-okta.md b/website/docs/docs/cloud/manage-access/set-up-sso-okta.md index 5ec70443d1f..4079cc488c4 100644 --- a/website/docs/docs/cloud/manage-access/set-up-sso-okta.md +++ b/website/docs/docs/cloud/manage-access/set-up-sso-okta.md @@ -171,7 +171,7 @@ configured in the steps above. | **Log in with** | Okta | | **Identity Provider SSO Url** | Paste the **Identity Provider Single Sign-On URL** shown in the Okta setup instructions | | **Identity Provider Issuer** | Paste the **Identity Provider Issuer** shown in the Okta setup instructions | -| **X.509 Certificate** | Paste the **X.509 Certificate** shown in the Okta setup instructions | +| **X.509 Certificate** | Paste the **X.509 Certificate** shown in the Okta setup instructions;
**Note:** When the certificate expires, an Okta admin will have to generate a new one to be pasted into dbt Cloud for uninterrupted application access. | | **Slug** | Enter your desired login slug. Users will be able to log into dbt Cloud by navigating to `https://YOUR_ACCESS_URL/enterprise-login/LOGIN-SLUG`, replacing `YOUR_ACCESS_URL` with the [appropriate Access URL](/docs/cloud/about-cloud/regions-ip-addresses) for your region and plan. Login slugs must be unique across all dbt Cloud accounts, so pick a slug that uniquely identifies your company. | **Note:** When the certificate expires, an Idp admin will have to generate a new one to be pasted into dbt Cloud for uninterrupted application access. | | Slug | Enter your desired login slug. | diff --git a/website/docs/docs/cloud/manage-access/sso-overview.md b/website/docs/docs/cloud/manage-access/sso-overview.md index 7e44859c73a..f613df7907e 100644 --- a/website/docs/docs/cloud/manage-access/sso-overview.md +++ b/website/docs/docs/cloud/manage-access/sso-overview.md @@ -1,7 +1,8 @@ --- -title: "SSO Overview" +title: "Single sign-on (SSO) Overview" id: "sso-overview" - +pagination_next: "docs/cloud/manage-access/set-up-sso-saml-2.0" +pagination_prev: null --- This overview explains how users are provisioned in dbt Cloud via Single Sign-On (SSO). diff --git a/website/docs/docs/cloud/secure/databricks-privatelink.md b/website/docs/docs/cloud/secure/databricks-privatelink.md index c136cd8a0f9..a2c9e208459 100644 --- a/website/docs/docs/cloud/secure/databricks-privatelink.md +++ b/website/docs/docs/cloud/secure/databricks-privatelink.md @@ -3,6 +3,7 @@ title: "Configuring Databricks PrivateLink" id: databricks-privatelink description: "Configuring PrivateLink for Databricks" sidebar_label: "PrivateLink for Databricks" +pagination_next: null --- The following steps will walk you through the setup of a Databricks AWS PrivateLink endpoint in the dbt Cloud multi-tenant environment. diff --git a/website/docs/docs/cloud/secure/ip-restrictions.md b/website/docs/docs/cloud/secure/ip-restrictions.md index 237de991c02..093d2a1c876 100644 --- a/website/docs/docs/cloud/secure/ip-restrictions.md +++ b/website/docs/docs/cloud/secure/ip-restrictions.md @@ -3,6 +3,8 @@ title: "Configuring IP restrictions" id: ip-restrictions description: "Configuring IP restrictions to outside traffic from accessing your dbt Cloud environment" sidebar_label: "IP restrictions" +pagination_next: "docs/cloud/secure/about-privatelink" +pagination_prev: null --- import SetUpPages from '/snippets/_available-tiers-iprestrictions.md'; diff --git a/website/docs/docs/cloud/secure/secure-your-tenant.md b/website/docs/docs/cloud/secure/secure-your-tenant.md new file mode 100644 index 00000000000..95cb8adffba --- /dev/null +++ b/website/docs/docs/cloud/secure/secure-your-tenant.md @@ -0,0 +1,49 @@ +--- +title: "Secure your tenant" +description: "Learn how to secure your tenant for dbt Cloud" +pagination_next: "docs/cloud/secure/ip-restrictions" +pagination_prev: null +--- + +
+ + + + + + + +
+
+
+ + + + + + +
\ No newline at end of file diff --git a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md index 36f4781bfde..b387c64788f 100644 --- a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md +++ b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md @@ -2,6 +2,7 @@ title: "Build and view your docs with dbt Cloud" id: "build-and-view-your-docs" description: "Automatically generate project documentation as you run jobs." +pagination_next: null --- dbt enables you to generate documentation for your project and data warehouse, and renders the documentation in a website. For more information, see [Documentation](/docs/collaborate/documentation). @@ -39,16 +40,17 @@ To create and schedule documentation-only jobs at the end of your production job You configure project documentation to generate documentation when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation. 1. Click the gear icon in the top right. -2. Select **Projects** and click the project that needs documentation. -3. Click **Edit**. -4. Under "Artifacts," select the job that should generate docs when it runs. +2. Select **Account Settings**. +3. Navigate to **Projects** and select the project that needs documentation. +4. Click **Edit**. +5. Under **Artifacts**, select the job that should generate docs when it runs. -5. Click **Save**. +6. Click **Save**. ## Generating documentation -To generate documentation in the IDE, run the `dbt docs generate` command in the -Command Bar in the IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session. +To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the +Command Bar in the dbt Cloud IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session. diff --git a/website/docs/docs/collaborate/collaborate-with-others.md b/website/docs/docs/collaborate/collaborate-with-others.md new file mode 100644 index 00000000000..7875a8044b6 --- /dev/null +++ b/website/docs/docs/collaborate/collaborate-with-others.md @@ -0,0 +1,38 @@ +--- +title: "Collaborate with others" +description: "Learn how dbt Cloud makes it easier to collaborate with others" +pagination_next: "docs/collaborate/explore-projects" +pagination_prev: null +--- + +
+ + + + + +
+
+
+ + + + + +
\ No newline at end of file diff --git a/website/docs/docs/collaborate/documentation.md b/website/docs/docs/collaborate/documentation.md index 429b5187152..0fa00c7cca2 100644 --- a/website/docs/docs/collaborate/documentation.md +++ b/website/docs/docs/collaborate/documentation.md @@ -2,6 +2,8 @@ title: "About documentation" description: "Learn how good documentation for your dbt models helps stakeholders discover and understand your datasets." id: "documentation" +pagination_next: "docs/collaborate/build-and-view-your-docs" +pagination_prev: null --- ## Related documentation diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md index a4c914259ef..b041cd0c915 100644 --- a/website/docs/docs/collaborate/explore-projects.md +++ b/website/docs/docs/collaborate/explore-projects.md @@ -1,25 +1,16 @@ --- -title: "Explore your dbt projects (beta)" -sidebar_label: "Explore dbt projects (beta)" +title: "Explore your dbt projects" +sidebar_label: "Explore dbt projects" description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your data pipelines." +pagination_next: null +pagination_prev: null --- -With dbt Explorer, you can view your project's [resources](/docs/build/projects) (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state. Navigate and manage your projects within dbt Cloud to help your data consumers discover and leverage your dbt resources. +With dbt Explorer, you can view your project's [resources](/docs/build/projects) (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state. Navigate and manage your projects within dbt Cloud to help you and other data developers, analysts, and consumers discover and leverage your dbt resources. -To display the details about your [project state](/docs/dbt-cloud-apis/project-state), dbt Explorer utilizes the metadata provided through the [Discovery API](/docs/dbt-cloud-apis/discovery-api). The metadata that's available on your project depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ in your dbt Cloud project. dbt Explorer automatically retrieves the metadata updates after each job run in the production deployment environment so it will always have the latest state on your project. The metadata it displays depends on the [commands executed by the jobs](/docs/deploy/job-commands). For instance: +:::tip Public preview -- To update model details or results, you must run `dbt run` or `dbt build` on a given model within a job in the environment. -- To view catalog statistics and columns, you must run `dbt docs generate` within a job in the environment. -- To view test results, you must run `dbt test` or `dbt build` within a job in the environment. -- To view source freshness check results, you must run `dbt source freshness` within a job in the environment. - -The need to run these commands will diminish, and richer, more timely metadata will become available as the Discovery API and its underlying platform evolve. - -:::tip Join the beta - -dbt Explorer is a [beta feature](/docs/dbt-versions/product-lifecycles#dbt-cloud) and subject to change without notification. More updates to this feature coming soon. - -If you’re interested in joining the beta, please contact your account team. +Try dbt Explorer! 
It's available in [Public Preview](/docs/dbt-versions/product-lifecycles#dbt-cloud) as of October 17, 2023 for dbt Cloud customers. More updates coming soon. ::: @@ -28,115 +19,218 @@ If you’re interested in joining the beta, please contact your account team. - You have a [multi-tenant](/docs/cloud/about-cloud/tenancy#multi-tenant) or AWS single-tenant dbt Cloud account on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). - You have set up a [production deployment environment](/docs/deploy/deploy-environments#set-as-production-environment-beta) for each project you want to explore. - There has been at least one successful job run in the production deployment environment. -- You are on the dbt Explorer page. This requires the feature to be enabled for your account. - - To go to the page, select **Explore (Beta)** from the top navigation bar in dbt Cloud. +- You are on the dbt Explorer page. To do this, select **Explore** from the top navigation bar in dbt Cloud. + + +## Generate metadata + +dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ in your dbt Cloud project. dbt Explorer automatically retrieves the metadata updates after each job run in the production deployment environment so it always has the latest results for your project. + +To view a resource and its metadata, you must define the resource in your project and run a job in the production environment. The resulting metadata depends on the [commands executed by the jobs](/docs/deploy/job-commands). + +For a richer experience with dbt Explorer, you must: + +- Run [dbt run](/reference/commands/run) or [dbt build](/reference/commands/build) on a given model within a job in the environment to update model details or results. +- Run [dbt docs generate](/reference/commands/cmd-docs) within a job in the environment to view catalog statistics and columns for models, sources, and snapshots. +- Run [dbt test](/reference/commands/test) or [dbt build](/reference/commands/build) within a job in the environment to view test results. +- Run [dbt source freshness](/reference/commands/source#dbt-source-freshness) within a job in the environment to view source freshness data. +- Run [dbt snapshot](/reference/commands/snapshot) or [dbt build](/reference/commands/build) within a job in the environment to view snapshot details. + +Richer and more timely metadata will become available as dbt, the Discovery API, and the underlying dbt Cloud platform evolves. -## Explore the project’s lineage +## Explore your project's lineage graph {#project-lineage} -dbt Explorer provides a visualization of your project’s DAG that you can interact with. To start, select **Overview** in the left sidebar and click the **Explore Lineage** button on the main (center) section of the page. +dbt Explorer provides a visualization of your project’s DAG that you can interact with. To access the project's full lineage graph, select **Overview** in the left sidebar and click the **Explore Lineage** button on the main (center) section of the page. -If you don't see the lineage graph immediately, click **Render Lineage**. It can take some time for the graph to render depending on the size of your project and your computer’s available memory. 
The graph of very large projects might not render so, instead, you can select a subset of nodes by using selectors. +If you don't see the project lineage graph immediately, click **Render Lineage**. It can take some time for the graph to render depending on the size of your project and your computer’s available memory. The graph of very large projects might not render so you can select a subset of nodes by using selectors, instead. -The nodes in the lineage graph represent the project’s resources and the edges represent the relationships between the nodes. Resources like tests and macros display in the lineage within their [resource details pages](#view-resource-details) but not within the overall project lineage graph. Nodes are color-coded and include iconography according to their resource type. +The nodes in the lineage graph represent the project’s resources and the edges represent the relationships between the nodes. Nodes are color-coded and include iconography according to their resource type. -To interact with the lineage graph, you can: +To explore the lineage graphs of tests and macros, view [their resource details pages](#view-resource-details). By default, dbt Explorer excludes these resources from the full lineage graph unless a search query returns them as results. + +To interact with the full lineage graph, you can: - Hover over any item in the graph to display the resource’s name and type. - Zoom in and out on the graph by mouse-scrolling. -- Grab and move the graph. -- Click on a resource to highlight its relationship with other resources in your project. -- [Search and select specific resources](#search-resources) or a subset of the DAG using selectors and lineage (for example, `+[YOUR_RESOURCE_NAME]` displays all nodes upstream of a particular resource). -- [View resource details](#view-resource-details) by selecting a node in the graph (double-clicking). +- Grab and move the graph and the nodes. +- Select a resource to highlight its relationship with other resources in your project. A panel opens on the graph’s right-hand side that displays a high-level summary of the resource’s details. The side panel includes a **General** tab for information like description, materialized type, and other details. + - Click the Share icon in the side panel to copy the graph’s link to your clipboard. + - Click the View Resource icon in the side panel to [view the resource details](#view-resource-details). +- [Search and select specific resources](#search-resources) or a subset of the DAG using selectors and graph operators. For example: + - `+[RESOURCE_NAME]` — Displays all parent nodes of the resource + - `resource_type:model [RESOURCE_NAME]` — Displays all models matching the name search +- [View resource details](#view-resource-details) by selecting a node (double-clicking) in the graph. +- Click the List view icon in the graph's upper right corner to return to the main **Explore** page. - + ## Search for resources {#search-resources} -With the search bar (on the upper left of the page or in a lineage graph), you can search using keywords or selectors (also known as *selector methods*). The resources that match your search criteria will display as a table in the main section of the page. When you select a resource in the table, its [resource details page](#view-resource-details) will display. +With the search bar (on the upper left corner of the page or in a lineage graph), you can search with keywords or by using [node selection syntax](/reference/node-selection/syntax). 
The resources that match your search criteria will display as a lineage graph and a table in the main section of the page. + +Select a node (single-click) in the lineage graph to highlight its relationship with your other search results and to display which project contains the resource's definition. When you choose a node (double-click) in the lineage graph or when you select a resource in the table, dbt Explorer displays the [resource's details page](#view-resource-details). -When using keyword search, dbt Explorer will search through your resources using metadata such as resource type, resource name, column name, source name, tags, schema, database, version, alias/identifier, and package name. +### Search with keywords +When searching with keywords, dbt Explorer searches through your resource metadata (such as resource type, resource name, column name, source name, tags, schema, database, version, alias/identifier, and package name) and returns any matches. -When using selector search, you can utilize the dbt node selection syntax including set and graph operators (like `+`). To learn more about selectors, refer to [Syntax overview](/reference/node-selection/syntax), [Graph operators](/reference/node-selection/graph-operators), and [Set operators](/reference/node-selection/set-operators). +### Search with selector methods -Below are the selection methods currently available in dbt Explorer. For more information about each of them, refer to [Methods](/reference/node-selection/methods). +You can search with [selector methods](/reference/node-selection/methods). Below are the selectors currently available in dbt Explorer: -- **fqn:** — Find resources by [file or fully qualified name](/reference/node-selection/methods#the-file-or-fqn-method). -- **source:** — Find resources by a specified [source](/reference/node-selection/methods#the-source-method). -- **resource_type:** — Find resources by their [type](/reference/node-selection/methods#the-resource_type-method). -- **package:** — Find resources by the [dbt package](/reference/node-selection/methods#the-package-method) that defines them. -- **tag:** — Find resources by a specified [tag](/reference/node-selection/methods#the-tag-method). +- `fqn:` — Find resources by [file or fully qualified name](/reference/node-selection/methods#the-fqn-method). This selector is the search bar's default. If you want to use the default, it's unnecessary to add `fqn:` before the search term. +- `source:` — Find resources by a specified [source](/reference/node-selection/methods#the-source-method). +- `resource_type:` — Find resources by their [type](/reference/node-selection/methods#the-resource_type-method). +- `package:` — Find resources by the [dbt package](/reference/node-selection/methods#the-package-method) that defines them. +- `tag:` — Find resources by a specified [tag](/reference/node-selection/methods#the-tag-method). -- **group:** — Find models defined within a specified [group](/reference/node-selection/methods#the-group-method). -- **access:** — Find models based on their [access](/reference/node-selection/methods#the-access-method) property. +- `group:` — Find models defined within a specified [group](/reference/node-selection/methods#the-group-method). +- `access:` — Find models based on their [access](/reference/node-selection/methods#the-access-method) property. - +### Search with graph operators + +You can use [graph operators](/reference/node-selection/graph-operators) on keywords or selector methods. 
For example, `+orders` returns all the parents of `orders`. + +### Search with set operators + +You can use multiple selector methods in your search query with [set operators](/reference/node-selection/set-operators). A space implies a union set operator and a comma for an intersection. For example: +- `resource_type:metric,tag:nightly` — Returns metrics with the tag `nightly` +- `+snowplow_sessions +fct_orders` — Returns resources that are parent nodes of either `snowplow_sessions` or `fct_orders` -## Use the catalog sidebar +### Search with both keywords and selector methods -By default, the catalog sidebar lists all your project’s resources. Select any resource type in the list and all those resources in the project will display as a table in the main section of the page. For a description on the different resource types (like models, metrics, and so on), refer to [About dbt projects](https://docs.getdbt.com/docs/build/projects). +You can use keyword search to highlight results that are filtered by the selector search. For example, if you don't have a resource called `customers`, then `resource_type:metric customers` returns all the metrics in your project and highlights those that are related to the term `customers` in the name, in a column, tagged as customers, and so on. + +When searching in this way, the selectors behave as filters that you can use to narrow the search and keywords as a way to find matches within those filtered results. + + + +## Browse with the sidebar + +By default, the catalog sidebar lists all your project’s resources. Select any resource type in the list and all those resources in the project will display as a table in the main section of the page. For a description on the different resource types (like models, metrics, and so on), refer to [About dbt projects](/docs/build/projects). To browse using a different view, you can choose one of these options from the **View by** dropdown: - **Resources** (default) — All resources in the project organized by type. -- **Packages** — All resources in the project organized by the project in which they are defined. +- **Packages** — All resources in the project organized by the dbt package in which they are defined. - **File Tree** — All resources in the project organized by the file in which they are defined. This mirrors the file tree in your dbt project repository. -- **Database** — All resources in the project organized by the database and schema in which they are built. This mirrors your data platform structure. +- **Database** — All resources in the project organized by the database and schema in which they are built. This mirrors your data platform's structure that represents the [applied state](/docs/dbt-cloud-apis/project-state) of your project. + + - +## View model versions + +If models in the project are versioned, you can see which [version of the model](/docs/collaborate/govern/model-versions) is being applied — `prerelease`, `latest`, and `old` — in the title of the model’s details page and in the model list from the sidebar. ## View resource details {#view-resource-details} -You can view the definition and latest run results of any resource in your project. To find a resource and view its details, you can interact with the lineage graph, use search, or browse the catalog. The details (metadata) available to you depends on the resource’s type, its definition, and the [commands](/docs/deploy/job-commands) run within jobs in the production environment. 
+You can view the definition and latest run results of any resource in your project. To find a resource and view its details, you can interact with the lineage graph, use search, or browse the catalog. - +The details (metadata) available to you depends on the resource’s type, its definition, and the [commands](/docs/deploy/job-commands) that run within jobs in the production environment. + ### Example of model details An example of the details you might get for a model: -- **General** — The model’s lineage graph that you can interact with. -- **Code** — The source code and compiled code for the model. -- **Columns** — The available columns in the model. -- **Description** — A [description of the model](/docs/collaborate/documentation#adding-descriptions-to-your-project). -- **Recent** — Information on the last time the model ran, how long it ran for, whether the run was successful, the job ID, and the run ID. -- **Tests** — [Tests](/docs/build/tests) for the model. -- **Details** — Key properties like the model’s relation name (for example, how it’s represented and how you can query it in the data platform: `database.schema.identifier`); model governance attributes like access, group, and if contracted; and more. -- **Relationships** — The nodes the model **Depends On** and is **Referenced by.** +- Status bar (below the page title) — Information on the last time the model ran, whether the run was successful, how the data is materialized, number of rows, and the size of the model. +- **General** tab includes: + - **Lineage** graph — The model’s lineage graph that you can interact with. The graph includes one parent node and one child node from the model. Click the Expand icon in the graph's upper right corner to view the model in full lineage graph mode. + - **Description** section — A [description of the model](/docs/collaborate/documentation#adding-descriptions-to-your-project). + - **Recent** section — Information on the last time the model ran, how long it ran for, whether the run was successful, the job ID, and the run ID. + - **Tests** section — [Tests](/docs/build/tests) for the model. + - **Details** section — Key properties like the model’s relation name (for example, how it’s represented and how you can query it in the data platform: `database.schema.identifier`); model governance attributes like access, group, and if contracted; and more. + - **Relationships** section — The nodes the model **Depends On**, is **Referenced by**, and (if applicable) is **Used by** for projects that have declared the models' project as a dependency. +- **Code** tab — The source code and compiled code for the model. +- **Columns** tab — The available columns in the model. This tab also shows tests results (if any) that you can select to view the test's details page. A :white_check_mark: denotes a passing test. + ### Example of exposure details An example of the details you might get for an exposure: -- **Status** — The status on data freshness and data quality. -- **Lineage** — The exposure’s lineage graph. -- **Description** — A description of the exposure. -- **Details** — Details like exposure type, maturity, owner information, and more. -- **Relationships** — The nodes the exposure **Depends On**. +- Status bar (below the page title) — Information on the last time the exposure was updated. +- **General** tab includes: + - **Status** section — The status on data freshness and data quality. + - **Lineage** graph — The exposure’s lineage graph. 
Click the Expand icon in the graph's upper right corner to view the exposure in full lineage graph mode. + - **Description** section — A description of the exposure. + - **Details** section — Details like exposure type, maturity, owner information, and more. + - **Relationships** section — The nodes the exposure **Depends On**. ### Example of test details An example of the details you might get for a test: -- **General** — The test’s lineage graph that you can interact with. -- **Code** — The source code and compiled code for the test. -- **Description** — A description of the test. -- **Recent** — Information on the last time the test ran, how long it ran for, whether the test passed, the job ID, and the run ID. -- **Details** — Details like schema, severity, package, and more. -- **Relationships** — The nodes the test **Depends On**. +- Status bar (below the page title) — Information on the last time the test ran, whether the test passed, test name, test target, and column name. +- **General** tab includes: + - **Lineage** graph — The test’s lineage graph that you can interact with. The graph includes one parent node and one child node from the test resource. Click the Expand icon in the graph's upper right corner to view the test in full lineage graph mode. + - **Description** section — A description of the test. + - **Recent** section — Information on the last time the test ran, how long it ran for, whether the test passed, the job ID, and the run ID. + - **Details** section — Details like schema, severity, package, and more. + - **Relationships** section — The nodes the test **Depends On**. +- **Code** tab — The source code and compiled code for the test. + ### Example of source details An example of the details you might get for each source table within a source collection: -- **General** — The source’s lineage graph that you can interact with. -- **Columns** — The available columns in the source. -- **Description** — A description of the source. -- **Source freshness** — Information on whether refreshing the data was successful, the last time the source was loaded, the timestamp of when a run generated data, and the run ID. -- **Details** — Details like database, schema, and more. -- **Relationships** — A table that lists all the sources used with their freshness status, the timestamp of when freshness was last checked, and the timestamp of when the source was last loaded. \ No newline at end of file +- Status bar (below the page title) — Information on the last time the source was updated and the number of tables the source uses. +- **General** tab includes: + - **Lineage** graph — The source’s lineage graph that you can interact with. The graph includes one parent node and one child node from the source. Click the Expand icon in the graph's upper right corner to view the source in full lineage graph mode. + - **Description** section — A description of the source. + - **Source freshness** section — Information on whether refreshing the data was successful, the last time the source was loaded, the timestamp of when a run generated data, and the run ID. + - **Details** section — Details like database, schema, and more. + - **Relationships** section — A table that lists all the sources used with their freshness status, the timestamp of when freshness was last checked, and the timestamp of when the source was last loaded. +- **Columns** tab — The available columns in the source. This tab also shows tests results (if any) that you can select to view the test's details page. 
+
+## About project-level lineage
+
+You can also view all the different projects and public models in the account, where the public models are defined, and how they are used, to gain a better understanding of your cross-project resources.
+
+When viewing the resource-level lineage graph for a given project that uses cross-project references, you can see cross-project relationships represented in the DAG. The iconography is slightly different depending on whether you're viewing the lineage of an upstream producer project or a downstream consumer project.
+
+When viewing an upstream (parent) project that produces public models that are imported by downstream (child) projects, public models will have a counter icon in their upper right corner that indicates the number of projects that declare the current project as a dependency. Selecting that model expands the lineage to show the specific projects that depend on this model. Projects show up in this counter if they declare the parent project as a dependency in their `dependencies.yml`, regardless of whether there's a direct `{{ ref() }}` against the public model. Selecting a project node from a public model opens the resource-level lineage graph for that project, which is subject to your permissions.
+
+When viewing a downstream (child) project that imports and refs public models from upstream (parent) projects, public models will show up in the lineage graph and display an icon on the graph edge that indicates what the relationship is to a model from another project. Hovering over this icon indicates the specific dbt Cloud project that produces that model. Double-clicking on a model from another project opens the resource-level lineage graph of the parent project, which is subject to your permissions.
+
+### Explore the project-level lineage graph
+
+For cross-project collaboration, you can interact with the DAG in all the same ways as described in [Explore your project's lineage](#project-lineage), but you can also interact with it at the project level and view the details.
+
+To get a list view of all the projects, select the account name at the top of the **Explore** page near the navigation bar. This view includes a public model list, project list, and a search bar for project searches. You can also view the project-level lineage graph by clicking the Lineage view icon in the page's upper right corner.
+
+If you have permissions for a project in the account, you can view all public models used across the entire account. However, you can only view full public model details and private models if you have permissions for a project where the models are defined.
+
+From the project-level lineage graph, you can:
+
+- Click the Lineage view icon (in the graph's upper right corner) to view the cross-project lineage graph.
+- Click the List view icon (in the graph's upper right corner) to view the project list.
+  - Select a project from the **Projects** tab to switch to that project's main **Explore** page.
+  - Select a model from the **Public Models** tab to view the [model's details page](#view-resource-details).
+  - Perform searches on your projects with the search bar.
+- Select a project node in the graph (by double-clicking) to switch to that particular project's lineage graph.
+
+When you select a project node in the graph, a project details panel opens on the graph's right-hand side where you can:
+
+- View counts of the resources defined in the project.
+- View a list of its public models, if any.
+- View a list of other projects that use the project, if any.
+- Click **Open Project Lineage** to switch to the project's lineage graph.
+- Click the Share icon to copy the project panel link to your clipboard so you can share the graph with someone.
+
+## Related content
+- [Enterprise permissions](/docs/cloud/manage-access/enterprise-permissions)
+- [About model governance](/docs/collaborate/govern/about-model-governance)
+- [What is data mesh?](https://www.getdbt.com/blog/what-is-data-mesh-the-definition-and-importance-of-data-mesh) blog
diff --git a/website/docs/docs/collaborate/git-version-control.md b/website/docs/docs/collaborate/git-version-control.md
index 4444f381bb5..392e2c3baa5 100644
--- a/website/docs/docs/collaborate/git-version-control.md
+++ b/website/docs/docs/collaborate/git-version-control.md
@@ -3,6 +3,8 @@ title: "About git"
id: git-version-control
description: "Git overview"
sidebar_label: "About git"
+pagination_next: "docs/collaborate/git/version-control-basics"
+pagination_prev: null
---

A [version control](https://en.wikipedia.org/wiki/Version_control) system allows you and your teammates to work collaboratively, safely, and simultaneously on a single project. Version control helps you track all the code changes made in your dbt project.
@@ -22,3 +24,4 @@ When you develop in the command line interface (CLI) or Cloud integrated develo
- [Merge conflicts](/docs/collaborate/git/merge-conflicts)
- [Connect to GitHub](/docs/cloud/git/connect-github)
- [Connect to GitLab](/docs/cloud/git/connect-gitlab)
+- [Connect to Azure DevOps](/docs/cloud/git/connect-azure-devops)
diff --git a/website/docs/docs/collaborate/git/merge-conflicts.md b/website/docs/docs/collaborate/git/merge-conflicts.md
index b109cacb511..c3c19b1e2a1 100644
--- a/website/docs/docs/collaborate/git/merge-conflicts.md
+++ b/website/docs/docs/collaborate/git/merge-conflicts.md
@@ -1,6 +1,7 @@
---
title: "Merge conflicts"
id: "merge-conflicts"
+pagination_next: null
---

[Merge conflicts](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/addressing-merge-conflicts/about-merge-conflicts) in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) often occur when multiple users are simultaneously making edits to the same section in the same file. This makes it difficult for Git to decide what changes to incorporate in the final merge.
diff --git a/website/docs/docs/collaborate/govern/about-model-governance.md b/website/docs/docs/collaborate/govern/about-model-governance.md index efeb2836bc6..bbc430845d2 100644 --- a/website/docs/docs/collaborate/govern/about-model-governance.md +++ b/website/docs/docs/collaborate/govern/about-model-governance.md @@ -2,6 +2,8 @@ title: "About model governance" id: about-model-governance description: "Information about new features related to model governance" +pagination_next: "docs/collaborate/govern/model-access" +pagination_prev: null --- diff --git a/website/docs/docs/collaborate/govern/model-access.md b/website/docs/docs/collaborate/govern/model-access.md index 64b70416a2f..765e833ac0c 100644 --- a/website/docs/docs/collaborate/govern/model-access.md +++ b/website/docs/docs/collaborate/govern/model-access.md @@ -25,7 +25,7 @@ The two concepts will be closely related, as we develop multi-project collaborat ## Related documentation * [`groups`](/docs/build/groups) -* [`access`](/reference/resource-properties/access) +* [`access`](/reference/resource-configs/access) ## Groups diff --git a/website/docs/docs/collaborate/govern/project-dependencies.md b/website/docs/docs/collaborate/govern/project-dependencies.md index 1dbc967e74e..9a1d8b59b68 100644 --- a/website/docs/docs/collaborate/govern/project-dependencies.md +++ b/website/docs/docs/collaborate/govern/project-dependencies.md @@ -3,18 +3,17 @@ title: "Project dependencies" id: project-dependencies sidebar_label: "Project dependencies" description: "Reference public models across dbt projects" +pagination_next: null --- -:::caution Closed Beta - dbt Cloud Enterprise -"Project" dependencies and cross-project `ref` are features of dbt Cloud Enterprise, currently in Closed Beta. To access these features while they are in beta, please contact your account team at dbt Labs. +:::info Available in Public Preview for dbt Cloud Enterprise accounts -**Prerequisites:** In order to add project dependencies and resolve cross-project `ref`, you must: -- Have the feature enabled (speak to your account team) -- Use dbt v1.6 for **both** the upstream ("producer") project and the downstream ("consumer") project. -- Have a deployment environment in the upstream ("producer") project [that is set to be your production environment](/docs/deploy/deploy-environments#set-as-production-environment-beta) -- Have a successful run of the upstream ("producer") project +Project dependencies and cross-project `ref` are features available in [dbt Cloud Enterprise](https://www.getdbt.com/pricing), currently in [Public Preview](/docs/dbt-versions/product-lifecycles#dbt-cloud). + +Enterprise users can use these features by designating a [public model](/docs/collaborate/govern/model-access) and adding a [cross-project ref](#how-to-use-ref). ::: + For a long time, dbt has supported code reuse and extension by installing other projects as [packages](/docs/build/packages). When you install another project as a package, you are pulling in its full source code, and adding it to your own. This enables you to call macros and run models defined in that other project. While this is a great way to reuse code, share utility macros, and establish a starting point for common transformations, it's not a great way to enable collaboration across teams and at scale, especially at larger organizations. @@ -23,6 +22,33 @@ This year, dbt Labs is introducing an expanded notion of `dependencies` across m - **Packages** — Familiar and pre-existing type of dependency. 
You take this dependency by installing the package's full source code (like a software library).
- **Projects** — A _new_ way to take a dependency on another project. Using a metadata service that runs behind the scenes, dbt Cloud resolves references on-the-fly to public models defined in other projects. You don't need to parse or run those upstream models yourself. Instead, you treat your dependency on those models as an API that returns a dataset. The maintainer of the public model is responsible for guaranteeing its quality and stability.
+
+Starting in dbt v1.6, `packages.yml` has been renamed to `dependencies.yml`. However, if you need to use Jinja within your packages config, such as an environment variable for your private package, you need to keep using `packages.yml` for your packages for now. Refer to the [FAQs](#faqs) for more info.
+
+## Prerequisites
+
+In order to add project dependencies and resolve cross-project `ref`, you must:
+- Use dbt v1.6 or higher for **both** the upstream ("producer") project and the downstream ("consumer") project.
+- Have a deployment environment in the upstream ("producer") project [that is set to be your production environment](/docs/deploy/deploy-environments#set-as-production-environment-beta).
+- Have a successful run of the upstream ("producer") project.
+- Have a multi-tenant or single-tenant [dbt Cloud Enterprise](https://www.getdbt.com/pricing) account (Azure ST is not supported but coming soon).
+
## Example

As an example, let's say you work on the Marketing team at the Jaffle Shop. The name of your team's project is `jaffle_marketing`:
@@ -36,7 +62,7 @@ name: jaffle_marketing

As part of your modeling of marketing data, you need to take a dependency on two other projects:

-- `dbt_utils` as a [package](#packages-use-case): An collection of utility macros that you can use while writing the SQL for your own models. This package is, open-source public, and maintained by dbt Labs.
+- `dbt_utils` as a [package](#packages-use-case): A collection of utility macros that you can use while writing the SQL for your own models. This package is open source, public, and maintained by dbt Labs.
- `jaffle_finance` as a [project use-case](#projects-use-case): Data models about the Jaffle Shop's revenue. This project is private and maintained by your colleagues on the Finance team. You want to select from some of this project's final models, as a starting point for your own work.
@@ -66,7 +92,7 @@ When you're building on top of another team's work, resolving the references in
- You don't need to mirror any conditional configuration of the upstream project such as `vars`, environment variables, or `target.name`. You can reference them directly wherever the Finance team is building their models in production. Even if the Finance team makes changes like renaming the model, changing the name of its schema, or [bumping its version](/docs/collaborate/govern/model-versions), your `ref` would still resolve successfully.
- You eliminate the risk of accidentally building those models with `dbt run` or `dbt build`. While you can select those models, you can't actually build them. This prevents unexpected warehouse costs and permissions issues. This also ensures proper ownership and cost allocation for each team's models.
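Pulling the example together, here is a minimal sketch of what `jaffle_marketing`'s `dependencies.yml` could look like with both dependency types declared side by side (the `dbt_utils` version shown is illustrative):

```yaml
# dependencies.yml in the jaffle_marketing project (sketch; the version number is illustrative)
packages:
  - package: dbt-labs/dbt_utils
    version: 1.1.1

projects:
  - name: jaffle_finance  # upstream ("producer") project whose public models you can ref
```

Because this file contains no Jinja, it can live in `dependencies.yml`; a private package that needs an environment variable would stay in `packages.yml`, as covered in the FAQs below.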
-### Usage +### How to use ref **Writing `ref`:** Models referenced from a `project`-type dependency must use [two-argument `ref`](/reference/dbt-jinja-functions/ref#two-argument-variant), including the project name: @@ -87,6 +113,8 @@ with monthly_revenue as ( **Cycle detection:** Currently, "project" dependencies can only go in one direction, meaning that the `jaffle_finance` project could not add a new model that depends, in turn, on `jaffle_marketing.roi_by_channel`. dbt will check for cycles across projects and raise errors if any are detected. We are considering support for this pattern in the future, whereby dbt would still check for node-level cycles while allowing cycles at the project level. +For more guidance on how to use dbt Mesh, refer to the dedicated [dbt Mesh guide](/guides/best-practices/how-we-mesh/mesh-1-intro). + ### Comparison If you were to instead install the `jaffle_finance` project as a `package` dependency, you would instead be pulling down its full source code and adding it to your runtime environment. This means: @@ -99,4 +127,16 @@ There are a few cases where installing another internal project as a package can - Unified deployments — In a production environment, if the central data platform team of Jaffle Shop wanted to schedule the deployment of models across both `jaffle_finance` and `jaffle_marketing`, they could use dbt's [selection syntax](/reference/node-selection/syntax) to create a new "passthrough" project that installed both projects as packages. - Coordinated changes — In development, if you wanted to test the effects of a change to a public model in an upstream project (`jaffle_finance.monthly_revenue`) on a downstream model (`jaffle_marketing.roi_by_channel`) _before_ introducing changes to a staging or production environment, you can install the `jaffle_finance` package as a package within `jaffle_marketing`. The installation can point to a specific git branch, however, if you find yourself frequently needing to perform end-to-end testing across both projects, we recommend you re-examine if this represents a stable interface boundary. -These are the exceptions, rather than the rule. Installing another team's project as a package adds complexity, latency, and risk of unnecessary costs. By defining clear interface boundaries across teams, by serving one team's public models as "APIs" to another, and by enabling practitioners to develop with a more narrowly-defined scope, we can enable more people to contribute, with more confidence, while requiring less context upfront. +These are the exceptions, rather than the rule. Installing another team's project as a package adds complexity, latency, and risk of unnecessary costs. By defining clear interface boundaries across teams, by serving one team's public models as "APIs" to another, and by enabling practitioners to develop with a more narrowly defined scope, we can enable more people to contribute, with more confidence, while requiring less context upfront. + +## FAQs + +
+Can I define private packages in the dependencies.yml file? + +If you're using private packages with the [git token method](/docs/build/packages#git-token-method), you must define them in the `packages.yml` file instead of the `dependencies.yml` file. This is because conditional rendering (like Jinja-in-yaml) is not supported. +
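As an illustration of that answer, a private package installed with the git token method relies on Jinja (`env_var`) and therefore stays in `packages.yml`. In this sketch, the environment variable name and repository URL are placeholders:

```yaml
# packages.yml (sketch; replace the placeholders with your own values)
packages:
  - git: "https://{{ env_var('DBT_ENV_SECRET_GIT_CREDENTIAL') }}@github.com/YOUR_ORG/YOUR_PRIVATE_REPO.git"
    revision: main  # a branch, tag, or commit hash
```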
+ + +## Related docs +- Refer to the [dbt Mesh](/guides/best-practices/how-we-mesh/mesh-1-intro) guide for more guidance on how to use dbt Mesh. diff --git a/website/docs/docs/community-adapters.md b/website/docs/docs/community-adapters.md index 87d1bd4981e..444ea0e04b4 100644 --- a/website/docs/docs/community-adapters.md +++ b/website/docs/docs/community-adapters.md @@ -11,10 +11,10 @@ Community adapters are adapter plugins contributed and maintained by members of | [Clickhouse](/docs/core/connect-data-platform/clickhouse-setup) | [Hive](/docs/core/connect-data-platform/hive-setup) | [Rockset](/docs/core/connect-data-platform/rockset-setup) | | [IBM DB2](/docs/core/connect-data-platform/ibmdb2-setup) | [Impala](/docs/core/connect-data-platform/impala-setup) | [SingleStore](/docs/core/connect-data-platform/singlestore-setup) | | [Doris & SelectDB](/docs/core/connect-data-platform/doris-setup) | [Infer](/docs/core/connect-data-platform/infer-setup) | [SQLite](/docs/core/connect-data-platform/sqlite-setup) | -| [DuckDB](/docs/core/connect-data-platform/duckdb-setup) | [iomete](/docs/core/connect-data-platform/iomete-setup) | [SQL Server & Azure SQL](/docs/core/connect-data-platform/mssql-setup) | -| [Dremio](/docs/core/connect-data-platform/dremio-setup) | [Layer](/docs/core/connect-data-platform/layer-setup) | [Teradata](/docs/core/connect-data-platform/teradata-setup) | -| [Exasol Analytics](/docs/core/connect-data-platform/exasol-setup) | [Materialize](/docs/core/connect-data-platform/materialize-setup) | [TiDB](/docs/core/connect-data-platform/tidb-setup) | -| [Firebolt](/docs/core/connect-data-platform/firebolt-setup) | [MindsDB](/docs/core/connect-data-platform/mindsdb-setup) | [Vertica](/docs/core/connect-data-platform/vertica-setup) | -| [AWS Glue](/docs/core/connect-data-platform/glue-setup) | [MySQL](/docs/core/connect-data-platform/mysql-setup)| [Upsolver](/docs/core/connect-data-platform/upsolver-setup) | -| [Databend Cloud](/docs/core/connect-data-platform/databend-setup) | [fal - Python models](/docs/core/connect-data-platform/fal-setup) | | +| [Starrocks](/docs/core/connect-data-platform/starrocks-setup) | [DuckDB](/docs/core/connect-data-platform/duckdb-setup) | [iomete](/docs/core/connect-data-platform/iomete-setup) +| [SQL Server & Azure SQL](/docs/core/connect-data-platform/mssql-setup) | [Dremio](/docs/core/connect-data-platform/dremio-setup) | [Layer](/docs/core/connect-data-platform/layer-setup) +| [Teradata](/docs/core/connect-data-platform/teradata-setup) | [Exasol Analytics](/docs/core/connect-data-platform/exasol-setup) | [Materialize](/docs/core/connect-data-platform/materialize-setup) +| [TiDB](/docs/core/connect-data-platform/tidb-setup) | [Firebolt](/docs/core/connect-data-platform/firebolt-setup) | [MindsDB](/docs/core/connect-data-platform/mindsdb-setup) +| [Vertica](/docs/core/connect-data-platform/vertica-setup) | [AWS Glue](/docs/core/connect-data-platform/glue-setup) | [MySQL](/docs/core/connect-data-platform/mysql-setup) | +| [Upsolver](/docs/core/connect-data-platform/upsolver-setup) | [Databend Cloud](/docs/core/connect-data-platform/databend-setup) | [fal - Python models](/docs/core/connect-data-platform/fal-setup) | diff --git a/website/docs/docs/connect-adapters.md b/website/docs/docs/connect-adapters.md index f45da732abb..77ead34e51d 100644 --- a/website/docs/docs/connect-adapters.md +++ b/website/docs/docs/connect-adapters.md @@ -11,9 +11,9 @@ This section provides more details on different ways you can connect dbt to an a Explore the fastest and 
most reliable way to deploy dbt using dbt Cloud, a hosted architecture that runs dbt Core across your organization. dbt Cloud lets you seamlessly [connect](/docs/cloud/about-cloud-setup) with a variety of [verified](/docs/supported-data-platforms) data platform providers directly in the dbt Cloud UI. -### Install using the CLI +### Install with dbt Core -Install dbt Core, which is an open-source tool, locally using the CLI. dbt communicates with a number of different data platforms by using a dedicated adapter plugin for each. When you install dbt Core, you'll also need to install the specific adapter for your database, [connect to dbt Core](/docs/core/about-core-setup), and set up a `profiles.yml` file. +Install dbt Core, an open-source tool, locally using the command line. dbt communicates with a number of different data platforms by using a dedicated adapter plugin for each. When you install dbt Core, you'll also need to install the specific adapter for your database, [connect to dbt Core](/docs/core/about-core-setup), and set up a `profiles.yml` file. With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `pip install adapter-name`. For example to install Snowflake, use the command `pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation). diff --git a/website/docs/docs/contribute-core-adapters.md b/website/docs/docs/contribute-core-adapters.md index 6e66a5d28ff..553361ee1a2 100644 --- a/website/docs/docs/contribute-core-adapters.md +++ b/website/docs/docs/contribute-core-adapters.md @@ -1,6 +1,7 @@ --- title: "Contribute to adapters" id: "contribute-core-adapters" +pagination_next: null --- The dbt Community exists to allow analytics practitioners share their knowledge, help others and collectively to drive forward the discipline of analytics engineering. There are opportunities here for everyone to contribute whether you're at the beginning your analytics engineering journey or you are a seasoned data professional. diff --git a/website/docs/docs/core/about-core-setup.md b/website/docs/docs/core/about-core-setup.md index 0408e529b2d..a4d5ff09ee3 100644 --- a/website/docs/docs/core/about-core-setup.md +++ b/website/docs/docs/core/about-core-setup.md @@ -3,13 +3,15 @@ title: About dbt Core setup id: about-core-setup description: "Configuration settings for dbt Core." sidebar_label: "About dbt Core setup" +pagination_next: "docs/core/about-dbt-core" +pagination_prev: null --- dbt Core is an [open-source](https://github.com/dbt-labs/dbt-core) tool that enables data teams to transform data using analytics engineering best practices. You can install dbt locally in your environment and use dbt Core on the command line. It can communicate with databases through adapters. 
This section of our docs will guide you through various settings to get started:
-- [About the CLI](/docs/core/about-the-cli)
+- [About dbt Core](/docs/core/about-dbt-core)
- [Installing dbt](/docs/core/installation)
- [Connecting to a data platform](/docs/core/connect-data-platform/profiles.yml)
- [How to run your dbt projects](/docs/running-a-dbt-project/run-your-dbt-projects)
diff --git a/website/docs/docs/core/about-dbt-core.md b/website/docs/docs/core/about-dbt-core.md
new file mode 100644
index 00000000000..a35d92420f3
--- /dev/null
+++ b/website/docs/docs/core/about-dbt-core.md
@@ -0,0 +1,25 @@
+---
+title: "About dbt Core"
+id: "about-dbt-core"
+sidebar_label: "About dbt Core"
+---
+
+[dbt Core](https://github.com/dbt-labs/dbt-core) is an open-source project where you can develop from the command line and run your dbt project.
+
+To use dbt Core, your workflow generally looks like:
+
+1. **Build your dbt project in a code editor —** popular choices include VSCode and Atom.
+
+2. **Run your project from the command line —** macOS ships with a default Terminal program; however, you can also use iTerm or the command line prompt within a code editor to execute dbt commands.
+
+:::info How we set up our computers for working on dbt projects
+
+We've written a [guide](https://discourse.getdbt.com/t/how-we-set-up-our-computers-for-working-on-dbt-projects/243) for our recommended setup when running dbt projects using dbt Core.
+
+:::
+
+If you're using the command line, we recommend learning some basics of your terminal to help you work more effectively. In particular, it's important to understand `cd`, `ls` and `pwd` to be able to navigate through the directory structure of your computer easily.
+
+You can find more information on installing and setting up dbt Core [here](/docs/core/installation).
+
+**Note** — dbt supports a dbt Cloud CLI and dbt Core, both command line interface tools that enable you to run dbt commands. The key distinction is that the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
diff --git a/website/docs/docs/core/about-the-cli.md b/website/docs/docs/core/about-the-cli.md
deleted file mode 100644
index d05fb514dfa..00000000000
--- a/website/docs/docs/core/about-the-cli.md
+++ /dev/null
@@ -1,22 +0,0 @@
----
-title: "About the CLI"
-id: "about-the-cli"
-sidebar_label: "About the CLI"
----
-
-dbt ships with a command line interface (CLI) for running your dbt project. This way of running dbt and a dbt project is free and open source.
-
-To use the CLI, your workflow generally looks like:
-1. **Build your dbt project in a code editor —** popular choices include VSCode and Atom.
-
-1. **Run your project from the command line —** macOS ships with a default Terminal program, however you can also use iTerm or the command line prompt within a code editor to execute dbt commands.
-
-:::info How we set up our computers for working on dbt projects
-
-We've written a [guide](https://discourse.getdbt.com/t/how-we-set-up-our-computers-for-working-on-dbt-projects/243) for our recommended setup when running dbt projects using the CLI.
-
-:::
-
-If you're using the CLI, we recommend learning some basics of your terminal to help you work more effectively. In particular, it's important to understand `cd`, `ls` and `pwd` to be able to navigate through the directory structure of your computer easily.
- -You can find more information on installing and setting up the dbt CLI [here](/dbt-cli/cli-overview). diff --git a/website/docs/docs/core/connect-data-platform/about-core-connections.md b/website/docs/docs/core/connect-data-platform/about-core-connections.md index 802e197514c..a85a32cc031 100644 --- a/website/docs/docs/core/connect-data-platform/about-core-connections.md +++ b/website/docs/docs/core/connect-data-platform/about-core-connections.md @@ -4,6 +4,8 @@ id: "about-core-connections" description: "Information about data platform connections in dbt Core" sidebar_label: "About data platform connections in dbt Core" hide_table_of_contents: true +pagination_next: "docs/core/connect-data-platform/profiles.yml" +pagination_prev: null --- dbt Core can connect with a variety of data platform providers including: diff --git a/website/docs/docs/core/connect-data-platform/alloydb-setup.md b/website/docs/docs/core/connect-data-platform/alloydb-setup.md index c3f3ee9cfca..c01ba06d887 100644 --- a/website/docs/docs/core/connect-data-platform/alloydb-setup.md +++ b/website/docs/docs/core/connect-data-platform/alloydb-setup.md @@ -3,7 +3,7 @@ title: "AlloyDB setup" meta: maintained_by: Community? authors: 'dbt-labs' - github_repo: 'dbt-labs/dbt-postgres' + github_repo: 'dbt-labs/dbt-core' pypi_package: 'dbt-postgres' min_core_version: 'v1.0.0' cloud_support: Not Supported diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md index 7a2a445be3f..4169b782594 100644 --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md @@ -74,10 +74,10 @@ my-bigquery-db: dev: type: bigquery method: oauth - project: [GCP project id] - dataset: [the name of your dbt dataset] # You can also use "schema" here - threads: [1 or more] - [](#optional-configurations): + project: GCP_PROJECT_ID + dataset: DBT_DATASET_NAME # You can also use "schema" here + threads: 4 # Must be a value of 1 or greater + [OPTIONAL_CONFIG](#optional-configurations): VALUE ```
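To make the `OPTIONAL_CONFIG: VALUE` placeholder concrete, here is a hedged sketch of a few optional settings you might add under the same `dev` target; the keys come from the profile's optional configurations and the values are examples only:

```yaml
      # Optional configurations (illustrative values)
      priority: interactive
      location: US
      maximum_bytes_billed: 1000000000
```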
@@ -90,14 +90,7 @@ If you do not specify a `project`/`database` and are using the `oauth` method, d See [docs](https://developers.google.com/identity/protocols/oauth2) on using OAuth 2.0 to access Google APIs. - - - +#### Refresh token Using the refresh token and client information, dbt will mint new access tokens as necessary. @@ -110,21 +103,19 @@ my-bigquery-db: dev: type: bigquery method: oauth-secrets - project: [GCP project id] - dataset: [the name of your dbt dataset] # You can also use "schema" here - threads: [1 or more] - refresh_token: [token] - client_id: [client id] - client_secret: [client secret] - token_uri: [redirect URI] - [](#optional-configurations): + project: GCP_PROJECT_ID + dataset: DBT_DATASET_NAME # You can also use "schema" here + threads: 4 # Must be a value of 1 or greater + refresh_token: TOKEN + client_id: CLIENT_ID + client_secret: CLIENT_SECRET + token_uri: REDIRECT_URI + [OPTIONAL_CONFIG](#optional-configurations): VALUE ``` - - - +#### Temporary token dbt will use the one-time access token, no questions asked. This approach makes sense if you have an external deployment process that can mint new access tokens and update the profile file accordingly. @@ -137,18 +128,15 @@ my-bigquery-db: dev: type: bigquery method: oauth-secrets - project: [GCP project id] - dataset: [the name of your dbt dataset] # You can also use "schema" here - threads: [1 or more] - token: [temporary access token] # refreshed + updated by external process - [](#optional-configurations): + project: GCP_PROJECT_ID + dataset: DBT_DATASET_NAME # You can also use "schema" here + threads: 4 # Must be a value of 1 or greater + token: TEMPORARY_ACCESS_TOKEN # refreshed + updated by external process + [OPTIONAL_CONFIG](#optional-configurations): VALUE ``` - - - ### Service Account File @@ -161,11 +149,11 @@ my-bigquery-db: dev: type: bigquery method: service-account - project: [GCP project id] - dataset: [the name of your dbt dataset] - threads: [1 or more] - keyfile: [/path/to/bigquery/keyfile.json] - [](#optional-configurations): + project: GCP_PROJECT_ID + dataset: DBT_DATASET_NAME + threads: 4 # Must be a value of 1 or greater + keyfile: /PATH/TO/BIGQUERY/keyfile.json + [OPTIONAL_CONFIG](#optional-configurations): VALUE ``` @@ -189,10 +177,10 @@ my-bigquery-db: dev: type: bigquery method: service-account-json - project: [GCP project id] - dataset: [the name of your dbt dataset] - threads: [1 or more] - [](#optional-configurations): + project: GCP_PROJECT_ID + dataset: DBT_DATASET_NAME + threads: 4 # Must be a value of 1 or greater + [OPTIONAL_CONFIG](#optional-configurations): VALUE # These fields come from the service account json keyfile keyfile_json: diff --git a/website/docs/docs/core/connect-data-platform/profiles.yml.md b/website/docs/docs/core/connect-data-platform/profiles.yml.md index 67b0eb15fbe..97254dda1c4 100644 --- a/website/docs/docs/core/connect-data-platform/profiles.yml.md +++ b/website/docs/docs/core/connect-data-platform/profiles.yml.md @@ -3,7 +3,7 @@ title: "About profiles.yml" id: profiles.yml --- -If you're using dbt from the [command line (CLI)](/docs/core/about-the-cli), you'll need a `profiles.yml` file that contains the connection details for your data platform. When you run dbt from the CLI, it reads your `dbt_project.yml` file to find the `profile` name, and then looks for a profile with the same name in your `profiles.yml` file. This profile contains all the information dbt needs to connect to your data platform. 
+If you're using [dbt Core](/docs/core/about-dbt-core), you'll need a `profiles.yml` file that contains the connection details for your data platform. When you run dbt Core from the command line, it reads your `dbt_project.yml` file to find the `profile` name, and then looks for a profile with the same name in your `profiles.yml` file. This profile contains all the information dbt needs to connect to your data platform. For detailed info, you can refer to the [Connection profiles](/docs/core/connect-data-platform/connection-profiles). diff --git a/website/docs/docs/core/connect-data-platform/starrocks-setup.md b/website/docs/docs/core/connect-data-platform/starrocks-setup.md new file mode 100644 index 00000000000..e5c1abac037 --- /dev/null +++ b/website/docs/docs/core/connect-data-platform/starrocks-setup.md @@ -0,0 +1,103 @@ +--- +title: "Starrocks setup" +description: "Read this guide to learn about the Starrocks warehouse setup in dbt." +id: "starrocks-setup" +meta: + maintained_by: Starrocks + authors: Astralidea + github_repo: 'StarRocks/starrocks/tree/main/contrib/dbt-connector' + pypi_package: 'dbt-starrocks' + min_core_version: 'v1.6.2' + min_supported_version: 'Starrocks 2.5' + cloud_support: Not Supported + slack_channel_name: '#db-starrocks' + slack_channel_link: 'https://www.getdbt.com/community' + platform_name: 'Starrocks' + config_page: '/reference/resource-configs/starrocks-configs' +--- + +

## Overview of {frontMatter.meta.pypi_package}

- **Maintained by**: {frontMatter.meta.maintained_by}
- **Authors**: {frontMatter.meta.authors}
- **GitHub repo**: {frontMatter.meta.github_repo}
- **PyPI package**: {frontMatter.meta.pypi_package}
- **Slack channel**: {frontMatter.meta.slack_channel_name}
- **Supported dbt Core version**: {frontMatter.meta.min_core_version} and newer
- **dbt Cloud support**: {frontMatter.meta.cloud_support}
- **Minimum data platform version**: {frontMatter.meta.min_supported_version}

## Installing {frontMatter.meta.pypi_package}

pip is the easiest way to install the adapter:

pip install {frontMatter.meta.pypi_package}

Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

## Configuring {frontMatter.meta.pypi_package}

For {frontMatter.meta.platform_name}-specific configuration, please refer to {frontMatter.meta.platform_name} Configuration.

For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

+ + +## Authentication Methods + +### User / Password Authentication + +Starrocks can be configured using basic user/password authentication as shown below. + + + +```yaml +my-starrocks-db: + target: dev + outputs: + dev: + type: starrocks + host: localhost + port: 9030 + schema: analytics + + # User/password auth + username: your_starrocks_username + password: your_starrocks_password +``` + + + +#### Description of Profile Fields +| Option | Description | Required? | Example | +|----------|--------------------------------------------------------|-----------|--------------------------------| +| type | The specific adapter to use | Required | `starrocks` | +| host | The hostname to connect to | Required | `192.168.100.28` | +| port | The port to use | Required | `9030` | +| schema | Specify the schema (database) to build models into | Required | `analytics` | +| username | The username to use to connect to the server | Required | `dbt_admin` | +| password | The password to use for authenticating to the server | Required | `correct-horse-battery-staple` | +| version | Let Plugin try to go to a compatible starrocks version | Optional | `3.1.0` | + +## Supported features + +| Starrocks <= 2.5 | Starrocks 2.5 ~ 3.1 | Starrocks >= 3.1 | Feature | +|:----------------:|:--------------------:|:-----------------:|:---------------------------------:| +| ✅ | ✅ | ✅ | Table materialization | +| ✅ | ✅ | ✅ | View materialization | +| ❌ | ❌ | ✅ | Materialized View materialization | +| ❌ | ✅ | ✅ | Incremental materialization | +| ❌ | ✅ | ✅ | Primary Key Model | +| ✅ | ✅ | ✅ | Sources | +| ✅ | ✅ | ✅ | Custom data tests | +| ✅ | ✅ | ✅ | Docs generate | +| ❌ | ❌ | ❌ | Kafka | + +### Notice +1. When StarRocks Version < 2.5, `Create table as` can only set engine='OLAP' and table_type='DUPLICATE' +2. When StarRocks Version >= 2.5, `Create table as` supports table_type='PRIMARY' +3. When StarRocks Version < 3.1 distributed_by is required + +It is recommended to use the latest starrocks version and dbt-starrocks version for the best experience. \ No newline at end of file diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md index 1fe33ff8929..1ba8e506b88 100644 --- a/website/docs/docs/core/connect-data-platform/teradata-setup.md +++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md @@ -4,7 +4,7 @@ description: "Read this guide to learn about the Teradata warehouse setup in dbt id: "teradata-setup" meta: maintained_by: Teradata - authors: Doug Beatty and Adam Tworkiewicz + authors: Teradata github_repo: 'Teradata/dbt-teradata' pypi_package: 'dbt-teradata' min_core_version: 'v0.21.0' @@ -41,6 +41,28 @@ pip is the easiest way to install the adapter:

Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

### Python compatibility

| Plugin version | Python 3.6 | Python 3.7 | Python 3.8 | Python 3.9 | Python 3.10 | Python 3.11 |
| -------------- | ---------- | ---------- | ---------- | ---------- | ----------- | ----------- |
| 0.19.0.x       | ✅         | ✅         | ✅         | ❌         | ❌          | ❌          |
| 0.20.0.x       | ✅         | ✅         | ✅         | ✅         | ❌          | ❌          |
| 0.21.1.x       | ✅         | ✅         | ✅         | ✅         | ❌          | ❌          |
| 1.0.0.x        | ❌         | ✅         | ✅         | ✅         | ❌          | ❌          |
| 1.1.x.x        | ❌         | ✅         | ✅         | ✅         | ✅          | ❌          |
| 1.2.x.x        | ❌         | ✅         | ✅         | ✅         | ✅          | ❌          |
| 1.3.x.x        | ❌         | ✅         | ✅         | ✅         | ✅          | ❌          |
| 1.4.x.x        | ❌         | ✅         | ✅         | ✅         | ✅          | ✅          |
| 1.5.x          | ❌         | ✅         | ✅         | ✅         | ✅          | ✅          |
| 1.6.x          | ❌         | ❌         | ✅         | ✅         | ✅          | ✅          |

### dbt dependent packages version compatibility

| dbt-teradata | dbt-core | dbt-teradata-util | dbt-util       |
| ------------ | -------- | ----------------- | -------------- |
| 1.2.x        | 1.2.x    | 0.1.0             | 0.9.x or below |

## Configuring {frontMatter.meta.pypi_package}

For {frontMatter.meta.platform_name}-specific configuration, please refer to {frontMatter.meta.platform_name} Configuration.
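To show how a few of these settings fit together, here is a hedged sketch of a Teradata target in `profiles.yml`. The required field names (`host`, `user`, `password`, `schema`, `tmode`) follow the usual dbt-teradata profile conventions and aren't shown in this excerpt; the optional parameters are described in the table that follows:

```yaml
my_teradata_db:
  target: dev
  outputs:
    dev:
      type: teradata
      host: TERADATA_HOSTNAME
      user: TERADATA_USERNAME
      password: TERADATA_PASSWORD
      schema: dbt_dev
      tmode: ANSI     # only ANSI transaction mode is supported
      port: "1025"    # optional, quoted integer
      retries: 3      # optional, retry transient connection errors
```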

@@ -88,11 +110,15 @@ The plugin also supports the following optional connection parameters: Parameter | Default | Type | Description ----------------------- | ----------- | -------------- | --- `account` | | string | Specifies the database account. Equivalent to the Teradata JDBC Driver `ACCOUNT` connection parameter. +`browser` | | string | Specifies the command to open the browser for Browser Authentication, when logmech is BROWSER. Browser Authentication is supported for Windows and macOS. Equivalent to the Teradata JDBC Driver BROWSER connection parameter. +`browser_tab_timeout` | `"5"` | quoted integer | Specifies the number of seconds to wait before closing the browser tab after Browser Authentication is completed. The default is 5 seconds. The behavior is under the browser's control, and not all browsers support automatic closing of browser tabs. +`browser_timeout` | `"180"` | quoted integer | Specifies the number of seconds that the driver will wait for Browser Authentication to complete. The default is 180 seconds (3 minutes). `column_name` | `"false"` | quoted boolean | Controls the behavior of cursor `.description` sequence `name` items. Equivalent to the Teradata JDBC Driver `COLUMN_NAME` connection parameter. False specifies that a cursor `.description` sequence `name` item provides the AS-clause name if available, or the column name if available, or the column title. True specifies that a cursor `.description` sequence `name` item provides the column name if available, but has no effect when StatementInfo parcel support is unavailable. `connect_failure_ttl` | `"0"` | quoted integer | Specifies the time-to-live in seconds to remember the most recent connection failure for each IP address/port combination. The driver subsequently skips connection attempts to that IP address/port for the duration of the time-to-live. The default value of zero disables this feature. The recommended value is half the database restart time. Equivalent to the Teradata JDBC Driver `CONNECT_FAILURE_TTL` connection parameter. +`connect_timeout` | `"10000"` | quoted integer | Specifies the timeout in milliseconds for establishing a TCP socket connection. Specify 0 for no timeout. The default is 10 seconds (10000 milliseconds). `cop` | `"true"` | quoted boolean | Specifies whether COP Discovery is performed. Equivalent to the Teradata JDBC Driver `COP` connection parameter. `coplast` | `"false"` | quoted boolean | Specifies how COP Discovery determines the last COP hostname. Equivalent to the Teradata JDBC Driver `COPLAST` connection parameter. When `coplast` is `false` or omitted, or COP Discovery is turned off, then no DNS lookup occurs for the coplast hostname. When `coplast` is `true`, and COP Discovery is turned on, then a DNS lookup occurs for a coplast hostname. -`dbs_port` | `"1025"` | quoted integer | Specifies the database port number. Equivalent to the Teradata JDBC Driver `DBS_PORT` connection parameter. +`port` | `"1025"` | quoted integer | Specifies the database port number. Equivalent to the Teradata JDBC Driver `DBS_PORT` connection parameter. `encryptdata` | `"false"` | quoted boolean | Controls encryption of data exchanged between the driver and the database. Equivalent to the Teradata JDBC Driver `ENCRYPTDATA` connection parameter. `fake_result_sets` | `"false"` | quoted boolean | Controls whether a fake result set containing statement metadata precedes each real result set. `field_quote` | `"\""` | string | Specifies a single character string used to quote fields in a CSV file. 
@@ -102,11 +128,18 @@ Parameter | Default | Type | Description `lob_support` | `"true"` | quoted boolean | Controls LOB support. Equivalent to the Teradata JDBC Driver `LOB_SUPPORT` connection parameter. `log` | `"0"` | quoted integer | Controls debug logging. Somewhat equivalent to the Teradata JDBC Driver `LOG` connection parameter. This parameter's behavior is subject to change in the future. This parameter's value is currently defined as an integer in which the 1-bit governs function and method tracing, the 2-bit governs debug logging, the 4-bit governs transmit and receive message hex dumps, and the 8-bit governs timing. Compose the value by adding together 1, 2, 4, and/or 8. `logdata` | | string | Specifies extra data for the chosen logon authentication method. Equivalent to the Teradata JDBC Driver `LOGDATA` connection parameter. +`logon_timeout` | `"0"` | quoted integer | Specifies the logon timeout in seconds. Zero means no timeout. `logmech` | `"TD2"` | string | Specifies the logon authentication method. Equivalent to the Teradata JDBC Driver `LOGMECH` connection parameter. Possible values are `TD2` (the default), `JWT`, `LDAP`, `KRB5` for Kerberos, or `TDNEGO`. `max_message_body` | `"2097000"` | quoted integer | Specifies the maximum Response Message size in bytes. Equivalent to the Teradata JDBC Driver `MAX_MESSAGE_BODY` connection parameter. `partition` | `"DBC/SQL"` | string | Specifies the database partition. Equivalent to the Teradata JDBC Driver `PARTITION` connection parameter. +`request_timeout` | `"0"` | quoted integer | Specifies the timeout for executing each SQL request. Zero means no timeout. +`retries` | `0` | integer | Allows an adapter to automatically try again when the attempt to open a new connection on the database has a transient, infrequent error. This option can be set using the retries configuration. Default value is 0. The default wait period between connection attempts is one second. retry_timeout (seconds) option allows us to adjust this waiting period. +`runstartup` | "false" | quoted boolean | Controls whether the user's STARTUP SQL request is executed after logon. For more information, refer to User STARTUP SQL Request. Equivalent to the Teradata JDBC Driver RUNSTARTUP connection parameter. If retries is set to 3, the adapter will try to establish a new connection three times if an error occurs. +`sessions` | | quoted integer | Specifies the number of data transfer connections for FastLoad or FastExport. The default (recommended) lets the database choose the appropriate number of connections. Equivalent to the Teradata JDBC Driver SESSIONS connection parameter. `sip_support` | `"true"` | quoted boolean | Controls whether StatementInfo parcel is used. Equivalent to the Teradata JDBC Driver `SIP_SUPPORT` connection parameter. +`sp_spl` | `"true"` | quoted boolean | Controls whether stored procedure source code is saved in the database when a SQL stored procedure is created. Equivalent to the Teradata JDBC Driver SP_SPL connection parameter. `sslca` | | string | Specifies the file name of a PEM file that contains Certificate Authority (CA) certificates for use with `sslmode` values `VERIFY-CA` or `VERIFY-FULL`. Equivalent to the Teradata JDBC Driver `SSLCA` connection parameter. +`sslcrc` | `"ALLOW"` | string | Equivalent to the Teradata JDBC Driver SSLCRC connection parameter. Values are case-insensitive.
• ALLOW provides "soft fail" behavior such that communication failures are ignored during certificate revocation checking.
• REQUIRE mandates that certificate revocation checking must succeed. `sslcapath` | | string | Specifies a directory of PEM files that contain Certificate Authority (CA) certificates for use with `sslmode` values `VERIFY-CA` or `VERIFY-FULL`. Only files with an extension of `.pem` are used. Other files in the specified directory are not used. Equivalent to the Teradata JDBC Driver `SSLCAPATH` connection parameter. `sslcipher` | | string | Specifies the TLS cipher for HTTPS/TLS connections. Equivalent to the Teradata JDBC Driver `SSLCIPHER` connection parameter. `sslmode` | `"PREFER"` | string | Specifies the mode for connections to the database. Equivalent to the Teradata JDBC Driver `SSLMODE` connection parameter.
• `DISABLE` disables HTTPS/TLS connections and uses only non-TLS connections.
• `ALLOW` uses non-TLS connections unless the database requires HTTPS/TLS connections.
• `PREFER` uses HTTPS/TLS connections unless the database does not offer HTTPS/TLS connections.
• `REQUIRE` uses only HTTPS/TLS connections.
• `VERIFY-CA` uses only HTTPS/TLS connections and verifies that the server certificate is valid and trusted.
• `VERIFY-FULL` uses only HTTPS/TLS connections, verifies that the server certificate is valid and trusted, and verifies that the server certificate matches the database hostname. @@ -124,6 +157,91 @@ For the full description of the connection parameters see https://github.com/Ter * `ephemeral` * `incremental` +#### Incremental Materialization +The following incremental materialization strategies are supported: +* `append` (default) +* `delete+insert` +* `merge` + +To learn more about dbt incremental strategies please check [the dbt incremental strategy documentation](https://docs.getdbt.com/docs/build/incremental-models#about-incremental_strategy). + ### Commands All dbt commands are supported. + +## Support for model contracts +Model contracts are not yet supported with dbt-teradata. + +## Support for `dbt-utils` package +`dbt-utils` package is supported through `teradata/teradata_utils` dbt package. The package provides a compatibility layer between `dbt_utils` and `dbt-teradata`. See [teradata_utils](https://hub.getdbt.com/teradata/teradata_utils/latest/) package for install instructions. + +### Cross DB macros +Starting with release 1.3, some macros were migrated from [teradata-dbt-utils](https://github.com/Teradata/dbt-teradata-utils) dbt package to the connector. See the table below for the macros supported from the connector. + +For using cross DB macros, teradata-utils as a macro namespace will not be used, as cross DB macros have been migrated from teradata-utils to Dbt-Teradata. + + +#### Compatibility + +| Macro Group | Macro Name | Status | Comment | +|:---------------------:|:-----------------------------:|:---------------------:|:----------------------------------------------------------------------:| +| Cross-database macros | current_timestamp | :white_check_mark: | custom macro provided | +| Cross-database macros | dateadd | :white_check_mark: | custom macro provided | +| Cross-database macros | datediff | :white_check_mark: | custom macro provided, see [compatibility note](#datediff) | +| Cross-database macros | split_part | :white_check_mark: | custom macro provided | +| Cross-database macros | date_trunc | :white_check_mark: | custom macro provided | +| Cross-database macros | hash | :white_check_mark: | custom macro provided, see [compatibility note](#hash) | +| Cross-database macros | replace | :white_check_mark: | custom macro provided | +| Cross-database macros | type_string | :white_check_mark: | custom macro provided | +| Cross-database macros | last_day | :white_check_mark: | no customization needed, see [compatibility note](#last_day) | +| Cross-database macros | width_bucket | :white_check_mark: | no customization + + +#### examples for cross DB macros + ##### replace + {{ dbt.replace("string_text_column", "old_chars", "new_chars") }} + {{ replace('abcgef', 'g', 'd') }} + + ##### date_trunc + {{ dbt.date_trunc("date_part", "date") }} + {{ dbt.date_trunc("DD", "'2018-01-05 12:00:00'") }} + + ##### datediff + `datediff` macro in teradata supports difference between dates. Differece between timestamps is not supported. + + ##### hash + + `Hash` macro needs an `md5` function implementation. Teradata doesn't support `md5` natively. You need to install a User Defined Function (UDF): + 1. Download the md5 UDF implementation from Teradata (registration required): https://downloads.teradata.com/download/extensibility/md5-message-digest-udf. + 1. Unzip the package and go to `src` directory. + 1. Start up `bteq` and connect to your database. + 1. 
Create database `GLOBAL_FUNCTIONS` that will host the UDF. You can't change the database name as it's hardcoded in the macro: + ```sql + CREATE DATABASE GLOBAL_FUNCTIONS AS PERMANENT = 60e6, SPOOL = 120e6; + ``` + 1. Create the UDF. Replace `` with your current database user: + ```sql + GRANT CREATE FUNCTION ON GLOBAL_FUNCTIONS TO ; + DATABASE GLOBAL_FUNCTIONS; + .run file = hash_md5.btq + ``` + 1. Grant permissions to run the UDF with grant option. + ```sql + GRANT EXECUTE FUNCTION ON GLOBAL_FUNCTIONS TO PUBLIC WITH GRANT OPTION; + ``` + ##### last_day + + `last_day` in `teradata_utils`, unlike the corresponding macro in `dbt_utils`, doesn't support `quarter` datepart. + +## Limitations + +### Transaction mode +Only ANSI transaction mode is supported. + +## Credits + +The adapter was originally created by [Doug Beatty](https://github.com/dbeatty10). Teradata took over the adapter in January 2022. We are grateful to Doug for founding the project and accelerating the integration of dbt + Teradata. + +## License + +The adapter is published using Apache-2.0 License. Refer to the [terms and conditions](https://github.com/dbt-labs/dbt-core/blob/main/License.md) to understand items such as creating derivative work and the support model. diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md index 396634dc6e6..39d8ed8ab3f 100644 --- a/website/docs/docs/core/connect-data-platform/trino-setup.md +++ b/website/docs/docs/core/connect-data-platform/trino-setup.md @@ -83,7 +83,7 @@ The following profile fields are optional to set up. They let you configure your | Profile field | Example | Description | | ----------------------------- | -------------------------------- | ----------------------------------------------------------------------------------------------------------- | | `threads` | `8` | How many threads dbt should use (default is `1`) | -| `roles` | `system: analyst` | Catalog roles | +| `roles` | `system: analyst` | Catalog roles can be set under the optional `roles` parameter using the following format: `catalog: role`. | | `session_properties` | `query_max_run_time: 4h` | Sets Trino session properties used in the connection. Execute `SHOW SESSION` to see available options | | `prepared_statements_enabled` | `true` or `false` | Enable usage of Trino prepared statements (used in `dbt seed` commands) (default: `true`) | | `retries` | `10` | Configure how many times all database operation is retried when connection issues arise (default: `3`) | diff --git a/website/docs/docs/core/connect-data-platform/upsolver-setup.md b/website/docs/docs/core/connect-data-platform/upsolver-setup.md index 68cfa3045cd..6b2f410fc07 100644 --- a/website/docs/docs/core/connect-data-platform/upsolver-setup.md +++ b/website/docs/docs/core/connect-data-platform/upsolver-setup.md @@ -14,6 +14,7 @@ meta: slack_channel_link: 'https://join.slack.com/t/upsolvercommunity/shared_invite/zt-1zo1dbyys-hj28WfaZvMh4Z4Id3OkkhA' platform_name: 'Upsolver' config_page: '/reference/resource-configs/upsolver-configs' +pagination_next: null ---

Overview of {frontMatter.meta.pypi_package}

diff --git a/website/docs/docs/core/dbt-core-environments.md b/website/docs/docs/core/dbt-core-environments.md index 5daf17bddf9..c7f340557fd 100644 --- a/website/docs/docs/core/dbt-core-environments.md +++ b/website/docs/docs/core/dbt-core-environments.md @@ -1,6 +1,7 @@ --- title: "dbt Core environments" id: "dbt-core-environments" +pagination_next: "docs/running-a-dbt-project/run-your-dbt-projects" --- dbt makes it easy to maintain separate production and development environments through the use of [targets](/reference/dbt-jinja-functions/target.md) within a [profile](/docs/core/connect-data-platform/profiles.yml). A typical profile, when using dbt locally (for example, running from your command line), will have a target named `dev` and have this set as the default. This means that while making changes, your objects will be built in your _development_ target without affecting production queries made by your end users. Once you are confident in your changes, you can deploy the code to _production_, by running your dbt project with a _prod_ target. diff --git a/website/docs/docs/core/installation-overview.md b/website/docs/docs/core/installation-overview.md index f1fdb800fdf..cb1df26b0f8 100644 --- a/website/docs/docs/core/installation-overview.md +++ b/website/docs/docs/core/installation-overview.md @@ -2,6 +2,8 @@ title: "About installing dbt" id: "installation" description: "You can install dbt Core using a few different tested methods." +pagination_next: "docs/core/homebrew-install" +pagination_prev: null --- You can install dbt Core on the command line by using one of these methods: @@ -11,9 +13,17 @@ You can install dbt Core on the command line by using one of these methods: - [Use a Docker image to install dbt](/docs/core/docker-install) - [Install dbt from source](/docs/core/source-install) +:::tip Pro tip: Using the --help flag + +Most command-line tools, including dbt, have a `--help` flag that you can use to show available commands and arguments. For example, you can use the `--help` flag with dbt in two ways:

+— `dbt --help`: Lists the commands available for dbt
+— `dbt run --help`: Lists the flags available for the `run` command + +::: + ## Upgrading dbt Core -dbt provides a number of resources for understanding [general best practices](/blog/upgrade-dbt-without-fear) while upgrading your dbt project as well as detailed [migration guides](/guides/migration/versions/upgrading-to-v1.4) highlighting the changes required for each minor and major release, and [core versions](/docs/dbt-versions/core) +dbt provides a number of resources for understanding [general best practices](/blog/upgrade-dbt-without-fear) while upgrading your dbt project as well as detailed [migration guides](/docs/dbt-versions/core-upgrade/upgrading-to-v1.4) highlighting the changes required for each minor and major release, and [core versions](/docs/dbt-versions/core) - [Upgrade Homebrew](/docs/core/homebrew-install#upgrading-dbt-and-your-adapter) - [Upgrade `pip`](/docs/core/pip-install#change-dbt-core-versions) diff --git a/website/docs/docs/core/pip-install.md b/website/docs/docs/core/pip-install.md index a35ad5f0d77..44fac00e493 100644 --- a/website/docs/docs/core/pip-install.md +++ b/website/docs/docs/core/pip-install.md @@ -5,7 +5,7 @@ description: "You can use pip to install dbt Core and adapter plugins from the c You need to use `pip` to install dbt Core on Windows or Linux operating systems. You can use `pip` or [Homebrew](/docs/core/homebrew-install) for installing dbt Core on a MacOS. -You can install dbt Core and plugins using `pip` because they are Python modules distributed on [PyPI](https://pypi.org/project/dbt/). +You can install dbt Core and plugins using `pip` because they are Python modules distributed on [PyPI](https://pypi.org/project/dbt-core/). diff --git a/website/docs/docs/core/source-install.md b/website/docs/docs/core/source-install.md index be9918223fe..42086159c03 100644 --- a/website/docs/docs/core/source-install.md +++ b/website/docs/docs/core/source-install.md @@ -1,6 +1,7 @@ --- title: "Install from source" description: "You can install dbt Core from its GitHub code source." +pagination_next: null --- dbt Core and almost all of its adapter plugins are open source software. As such, the codebases are freely available to download and build from source. You might install from source if you want the latest code or want to install dbt from a specific commit. This might be helpful when you are contributing changes, or if you want to debug a past change. diff --git a/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md b/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md index 8a5712f40df..168ec0c80f4 100644 --- a/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md +++ b/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md @@ -1,6 +1,7 @@ --- title: "dbt Cloud Administrative API" id: "admin-cloud-api" +pagination_next: "docs/dbt-cloud-apis/discovery-api" --- The dbt Cloud Administrative API is enabled by default for [Team and Enterprise plans](https://www.getdbt.com/pricing/). It can be used to: diff --git a/website/docs/docs/dbt-cloud-apis/apis-overview.md b/website/docs/docs/dbt-cloud-apis/apis-overview.md index b7d722747d8..eef64992af9 100644 --- a/website/docs/docs/dbt-cloud-apis/apis-overview.md +++ b/website/docs/docs/dbt-cloud-apis/apis-overview.md @@ -2,6 +2,8 @@ title: "APIs Overview" description: "Learn how dbt accounts on the Team and Enterprise plans can query the dbt Cloud APIs." 
id: "overview" +pagination_next: "docs/dbt-cloud-apis/user-tokens" +pagination_prev: null --- ## Overview diff --git a/website/docs/docs/dbt-cloud-apis/authentication.md b/website/docs/docs/dbt-cloud-apis/authentication.md new file mode 100644 index 00000000000..7deadd68f18 --- /dev/null +++ b/website/docs/docs/dbt-cloud-apis/authentication.md @@ -0,0 +1,22 @@ +--- +title: "Authentication" +description: "Learn how to authenticate with user tokens and service account tokens " +pagination_next: "docs/dbt-cloud-apis/user-tokens" +pagination_prev: null +--- + +
+ + + + + +
\ No newline at end of file diff --git a/website/docs/docs/dbt-cloud-apis/discovery-api.md b/website/docs/docs/dbt-cloud-apis/discovery-api.md index e4441aa55a2..747128cf7bc 100644 --- a/website/docs/docs/dbt-cloud-apis/discovery-api.md +++ b/website/docs/docs/dbt-cloud-apis/discovery-api.md @@ -1,5 +1,6 @@ --- title: "About the Discovery API" +pagination_next: "docs/dbt-cloud-apis/discovery-use-cases-and-examples" --- Every time dbt Cloud runs a project, it generates and stores information about the project. The metadata includes details about your project’s models, sources, and other nodes along with their execution results. With the dbt Cloud Discovery API, you can query this comprehensive information to gain a better understanding of your DAG and the data it produces. diff --git a/website/docs/docs/dbt-cloud-apis/discovery-querying.md b/website/docs/docs/dbt-cloud-apis/discovery-querying.md index ba1365e632b..35c092adb4b 100644 --- a/website/docs/docs/dbt-cloud-apis/discovery-querying.md +++ b/website/docs/docs/dbt-cloud-apis/discovery-querying.md @@ -2,6 +2,7 @@ title: "Query the Discovery API" id: "discovery-querying" sidebar_label: "Query the Discovery API" +pagination_next: "docs/dbt-cloud-apis/discovery-schema-environment" --- The Discovery API supports ad-hoc queries and integrations. If you are new to the API, refer to [About the Discovery API](/docs/dbt-cloud-apis/discovery-api) for an introduction. diff --git a/website/docs/docs/dbt-cloud-apis/schema-discovery-job.mdx b/website/docs/docs/dbt-cloud-apis/schema-discovery-job.mdx index bb30786e19d..8b02c5601ad 100644 --- a/website/docs/docs/dbt-cloud-apis/schema-discovery-job.mdx +++ b/website/docs/docs/dbt-cloud-apis/schema-discovery-job.mdx @@ -2,6 +2,8 @@ title: "Job object schema" sidebar_label: "Job" id: "discovery-schema-job" +pagination_next: "docs/dbt-cloud-apis/discovery-schema-job-model" +pagination_prev: null --- import { QueryArgsTable, SchemaTable } from "./schema"; diff --git a/website/docs/docs/dbt-cloud-apis/sl-api-overview.md b/website/docs/docs/dbt-cloud-apis/sl-api-overview.md index 42416765904..3ddbf76d152 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-api-overview.md +++ b/website/docs/docs/dbt-cloud-apis/sl-api-overview.md @@ -4,6 +4,7 @@ id: sl-api-overview description: "Integrate and query metrics and dimensions in downstream tools using the Semantic Layer APIs" tags: [Semantic Layer, API] hide_table_of_contents: true +pagination_next: "docs/dbt-cloud-apis/sl-jdbc" --- @@ -31,10 +32,8 @@ You can use the dbt Semantic Layer for a variety of tools and applications of da import Features from '/snippets/_sl-plan-info.md'
diff --git a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md index 02d26229794..e10d057dc75 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md +++ b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md @@ -5,7 +5,6 @@ description: "Integrate and use the JDBC API to query your metrics." tags: [Semantic Layer, API] --- - import LegacyInfo from '/snippets/_legacy-sl-callout.md'; @@ -59,11 +58,13 @@ jdbc:arrow-flight-sql://semantic-layer.cloud.getdbt.com:443?&environmentId=20233 ## Querying the API for metric metadata -The Semantic Layer JDBC API has built-in metadata calls which can provide a user with information about their metrics and dimensions. Here are some metadata commands and examples: +The Semantic Layer JDBC API has built-in metadata calls which can provide a user with information about their metrics and dimensions. + +Refer to the following tabs for metadata commands and examples: - + Use this query to fetch all defined metrics in your dbt project: @@ -74,7 +75,7 @@ select * from {{ ``` - + Use this query to fetch all dimensions for a metric. @@ -87,7 +88,7 @@ select * from {{ - + Use this query to fetch dimension values for one or multiple metrics and single dimension. @@ -100,7 +101,7 @@ semantic_layer.dimension_values(metrics=['food_order_amount'], group_by=['custom - + Use this query to fetch queryable granularities for a list of metrics. This API request allows you to only show the time granularities that make sense for the primary time dimension of the metrics (such as `metric_time`), but if you want queryable granularities for other time dimensions, you can use the `dimensions()` call, and find the column queryable_granularities. @@ -113,6 +114,9 @@ select * from {{ + + + @@ -144,9 +148,10 @@ select NAME, QUERYABLE_GRANULARITIES from {{ - + It may be useful in your application to expose the names of the time dimensions that represent `metric_time` or the common thread across all metrics. + You can first query the `metrics()` argument to fetch a list of measures, then use the `measures()` call which will return the name(s) of the time dimensions that make up metric time. ```bash @@ -167,12 +172,13 @@ To query metric values, here are the following parameters that are available: | `metrics` | The metric name as defined in your dbt metric configuration | `metrics=['revenue']` | Required | | `group_by` | Dimension names or entities to group by. We require a reference to the entity of the dimension (other than for the primary time dimension), which is pre-appended to the front of the dimension name with a double underscore. | `group_by=['user__country', 'metric_time']` | Optional | | `grain` | A parameter specific to any time dimension and changes the grain of the data from the default for the metric. | `group_by=[Dimension('metric_time')`
`grain('week\|day\|month\|quarter\|year')]` | Optional | -| `where` | A where clause that allows you to filter on dimensions and entities using parameters - comes with `TimeDimension`, `Dimension`, and `Entity` objects. Granularity is required with `TimeDimension` | `"{{ where=Dimension('customer__country') }} = 'US')"` | Optional | +| `where` | A where clause that allows you to filter on dimensions and entities using parameters. It accepts either a filter list or a string. Inputs are `Dimension` and `Entity` objects. Granularity is required if the `Dimension` is a time dimension | `"{{ where=Dimension('customer__country') }} = 'US')"` | Optional | | `limit` | Limit the data returned | `limit=10` | Optional | -|`order` | Order the data returned | `order_by=['-order_gross_profit']` (remove `-` for ascending order) | Optional | +|`order` | Order the data returned by a particular field | `order_by=['order_gross_profit']`; use `-` for descending order, or full object notation if the object is operated on: `order_by=[Metric('order_gross_profit').descending(True)]` | Optional | | `compile` | If true, returns generated SQL for the data platform but does not execute | `compile=True` | Optional | + ## Note on time dimensions and `metric_time` You will notice that in the list of dimensions for all metrics, there is a dimension called `metric_time`. `Metric_time` is a reserved keyword for the measure-specific aggregation time dimensions. For any time-series metric, the `metric_time` keyword should always be available for use in queries. This is a common dimension across *all* metrics in a semantic graph. @@ -246,47 +252,92 @@ select * from {{ Where filters in API allow for a filter list or string. We recommend using the filter list for production applications as this format will realize all benefits from the where possible. -Where filters have the following components that you can use: +Where filters have a few objects that you can use: -- `Dimension()` - This is used for any categorical or time dimensions. If used for a time dimension, granularity is required - `Dimension('metric_time').grain('week')` or `Dimension('customer__country')` +- `Dimension()` - Used for any categorical or time dimensions. If used for a time dimension, granularity is required - `Dimension('metric_time').grain('week')` or `Dimension('customer__country')` -- `TimeDimension()` - This is used for all time dimensions and requires a granularity argument - `TimeDimension('metric_time', 'MONTH)` +- `Entity()` - Used for entities like primary and foreign keys - `Entity('order_id')` -- `Entity()` - This is used for entities like primary and foreign keys - `Entity('order_id')` +Note: If you prefer a more explicit path to create the `where` clause, you can optionally use the `TimeDimension` feature. This helps separate categorical dimensions from time-related ones. The `TimeDimension` input takes the time dimension name and also requires granularity, like this: `TimeDimension('metric_time', 'MONTH')`. 
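For example, here's a minimal sketch of a `where` filter written with the more explicit `TimeDimension` input (it reuses the `food_order_amount` metric and `metric_time` dimension from the examples below; the date value is illustrative):

```bash
select * from {{
semantic_layer.query(metrics=['food_order_amount'],
        group_by=[Dimension('metric_time').grain('month')],
        where="{{ TimeDimension('metric_time', 'MONTH') }} >= '2017-03-09'")
        }}
```

This is equivalent to filtering with `Dimension('metric_time').grain('month')` as in the string-format example that follows; `TimeDimension` just makes the time grain explicit in the filter itself.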
-Use the following example to query using a `where` filter with the string format: +- Use the following example to query using a `where` filter with the string format: ```bash select * from {{ semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], group_by=[Dimension('metric_time').grain('month'),'customer__customer_type'], -where="{{ TimeDimension('metric_time', 'MONTH') }} >= '2017-03-09' AND {{ Dimension('customer__customer_type' }} in ('new') AND {{ Entity('order_id') }} = 10") +where="{{ Dimension('metric_time').grain('month') }} >= '2017-03-09' AND {{ Dimension('customer__customer_type') }} in ('new') AND {{ Entity('order_id') }} = 10") }} ``` -Use the following example to query using a `where` filter with a filter list format: +- (Recommended for better performance) Use the following example to query using a `where` filter with a filter list format: ```bash select * from {{ semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], group_by=[Dimension('metric_time').grain('month'),'customer__customer_type'], -where=[{{ TimeDimension('metric_time', 'MONTH')}} >= '2017-03-09', {{ Dimension('customer__customer_type' }} in ('new'), {{ Entity('order_id') }} = 10]) +where=["{{ Dimension('metric_time').grain('month') }} >= '2017-03-09'", "{{ Dimension('customer__customer_type') }} in ('new')", "{{ Entity('order_id') }} = 10"]) }} ``` -### Query with a limit and order by +### Query with a limit Use the following example to query using a `limit` or `order_by` clauses: +```bash +select * from {{ +semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], + group_by=[Dimension('metric_time')], + limit=10) + }} +``` +### Query with order by examples + +Order by can take a basic string that's a `Dimension`, `Metric`, or `Entity`, and it will default to ascending order. + ```bash select * from {{ semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], group_by=[Dimension('metric_time')], limit=10, - order_by=['order_gross_profit']) + order_by=['order_gross_profit']) }} ``` + +For descending order, you can add a `-` sign in front of the object. However, you can only use this shorthand notation if you aren't operating on the object or using the full object notation. 
+ +```bash +select * from {{ +semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], + group_by=[Dimension('metric_time')], + limit=10, + order_by=[-'order_gross_profit'] + }} +``` +If you are ordering by an object that's been operated on (e.g., change granularity), or you are using the full object notation, descending order must look like: + +```bash +select * from {{ +semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], + group_by=[Dimension('metric_time').grain('week')], + limit=10, + order_by=[Metric('order_gross_profit').descending(True), Dimension('metric_time').grain('week').descending(True) ] + }} +``` + +Similarly, this will yield ascending order: + +```bash +select * from {{ +semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'], + group_by=[Dimension('metric_time').grain('week')], + limit=10, + order_by=[Metric('order_gross_profit'), Dimension('metric_time').grain('week')] + }} +``` + + ### Query with compile keyword Use the following example to query using a `compile` keyword: diff --git a/website/docs/docs/dbt-cloud-apis/sl-manifest.md b/website/docs/docs/dbt-cloud-apis/sl-manifest.md index 47304accea3..6ecac495869 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-manifest.md +++ b/website/docs/docs/dbt-cloud-apis/sl-manifest.md @@ -4,6 +4,7 @@ id: sl-manifest description: "Learn about the semantic manifest.json file and how you can use artifacts to gain insights about your dbt Semantic Layer." tags: [Semantic Layer, APIs] sidebar_label: "Semantic manifest" +pagination_next: null --- diff --git a/website/docs/docs/dbt-cloud-apis/user-tokens.md b/website/docs/docs/dbt-cloud-apis/user-tokens.md index e56d8b2f974..77e536b12a5 100644 --- a/website/docs/docs/dbt-cloud-apis/user-tokens.md +++ b/website/docs/docs/dbt-cloud-apis/user-tokens.md @@ -1,6 +1,7 @@ --- title: "User tokens" id: "user-tokens" +pagination_next: "docs/dbt-cloud-apis/service-tokens" --- ## User API tokens @@ -13,7 +14,7 @@ permissions of the user the that they were created for. You can find your User API token in the Profile page under the `API Access` label. - + ## FAQs diff --git a/website/docs/docs/dbt-cloud-environments.md b/website/docs/docs/dbt-cloud-environments.md index f61ec5ef72b..8fa4522d47c 100644 --- a/website/docs/docs/dbt-cloud-environments.md +++ b/website/docs/docs/dbt-cloud-environments.md @@ -2,9 +2,10 @@ title: "dbt Cloud environments" id: "dbt-cloud-environments" description: "Learn about dbt Cloud's development environment to execute your project in the IDE" +pagination_next: null --- -An environment determines how dbt Cloud will execute your project in both the dbt Cloud IDE (for development) and scheduled jobs (for deployment). +An environment determines how dbt Cloud will execute your project in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) (for development) and scheduled jobs (for deployment). Critically, in order to execute dbt, environments define three variables: @@ -34,7 +35,7 @@ To create a new dbt Cloud development environment: ### Set developer credentials -To use the IDE, each developer will need to set up [personal development credentials](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud#access-the-cloud-ide) to your warehouse connection in their **Profile Settings**. This allows you to set separate target information and maintain individual credentials to connect to your warehouse via the dbt Cloud IDE. 
+To use the dbt Cloud IDE or dbt Cloud CLI, each developer will need to set up [personal development credentials](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud#access-the-cloud-ide) to your warehouse connection in their **Profile Settings**. This allows you to set separate target information and maintain individual credentials to connect to your warehouse. diff --git a/website/docs/docs/dbt-support.md b/website/docs/docs/dbt-support.md index f63e016b03e..513d5fff588 100644 --- a/website/docs/docs/dbt-support.md +++ b/website/docs/docs/dbt-support.md @@ -1,6 +1,8 @@ --- title: "dbt support" id: "dbt-support" +pagination_next: null +pagination_prev: null --- ## dbt Core support diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md new file mode 100644 index 00000000000..9ebd3c64cf3 --- /dev/null +++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md @@ -0,0 +1,70 @@ +--- +title: "Upgrading to v1.7 (latest)" +id: upgrading-to-v1.7 +description: New features and changes in dbt Core v1.7 +displayed_sidebar: "docs" +--- + +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + +## Resources + +- [Changelog](https://github.com/dbt-labs/dbt-core/blob/8aaed0e29f9560bc53d9d3e88325a9597318e375/CHANGELOG.md) +- [CLI Installation guide](/docs/core/installation) +- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud) +- [Release schedule](https://github.com/dbt-labs/dbt-core/issues/8260) + +## What to know before upgrading + +dbt Labs is committed to providing backward compatibility for all versions 1.x, with the exception of any changes explicitly mentioned below. If you encounter an error upon upgrading, please let us know by [opening an issue](https://github.com/dbt-labs/dbt-core/issues/new). + +### Behavior changes + +dbt Core v1.7 expands the amount of sources you can configure freshness for. Previously, freshness was limited to sources with a `loaded_at_field`; now, freshness can be generated from warehouse metadata tables when available. + +As part of this change, the `loaded_at_field` is no longer required to generate source freshness. If a source has a `freshness:` block, dbt will attempt to calculate freshness for that source: +- If a `loaded_at_field` is provided, dbt will calculate freshness via a select query (previous behavior). +- If a `loaded_at_field` is _not_ provided, dbt will calculate freshness via warehouse metadata tables when possible (new behavior). + +This is a relatively small behavior change, but worth calling out in case you notice that dbt is calculating freshness for _more_ sources than before. To exclude a source from freshness calculations, you have two options: +- Don't add a `freshness:` block. +- Explicitly set `freshness: null` + +## New and changed features and functionality + +- [`dbt docs generate`](/reference/commands/cmd-docs) now supports `--select` to generate [catalog metadata](/reference/artifacts/catalog-json) for a subset of your project. Currently available for Snowflake and Postgres only, but other adapters are coming soon. +- [Source freshness](/docs/deploy/source-freshness) can now be generated from warehouse metadata tables, currently Snowflake only, but other adapters that have metadata tables are coming soon. + +### MetricFlow enhancements + +- Automatically create metrics on measures with [`create_metric: true`](/docs/build/semantic-models). 
+- Optional [`label`](/docs/build/semantic-models) in semantic_models, measures, dimensions and entities. +- New configurations for semantic models - [enable/disable](/reference/resource-configs/enabled), [group](/reference/resource-configs/group), and [meta](/reference/resource-configs/meta). +- Support `fill_nulls_with` and `join_to_timespine` for metric nodes. +- `saved_queries` extends governance beyond the semantic objects to their consumption. + +### For consumers of dbt artifacts (metadata) + +- The [manifest](/reference/artifacts/manifest-json) schema version has been updated to v11. +- The [run_results](/reference/artifacts/run-results-json) schema version has been updated to v5. +- There are a few specific changes to the [catalog.json](/reference/artifacts/catalog-json): + - Added [node attributes](/reference/artifacts/run-results-json) related to compilation (`compiled`, `compiled_code`, `relation_name`) to the `catalog.json`. + - The nodes dictionary in the `catalog.json` can now be "partial" if `dbt docs generate` is run with a selector. + +### Model governance + +dbt Core v1.5 introduced model governance which we're continuing to refine. v1.7 includes these additional features and functionality: + +- **[Breaking change detection](/reference/resource-properties/versions#detecting-breaking-changes) for models with contracts enforced:** When dbt detects a breaking change to a model with an enforced contract during state comparison, it will now raise an error for versioned models and a warning for models that are not versioned. +- **[Set `access` as a config](/reference/resource-configs/access):** You can now set a model's `access` within config blocks in the model's file or in the `dbt_project.yml` for an entire subfolder at once. +- **[Type aliasing for model contracts](/reference/resource-configs/contract):** dbt will use each adapter's built-in type aliasing for user-provided data types—meaning you can now write `string` always, and dbt will translate to `text` on Postgres/Redshift. This is "on" by default, but you can opt-out. +- **[Raise warning for numeric types](/reference/resource-configs/contract):** Because of issues when putting `numeric` in model contracts without considering that default values such as `numeric(38,0)` might round decimals accordingly. dbt will now warn you if it finds a numeric type without specified precision/scale. + +### Quick hits + +With these quick hits, you can now: +- Configure a [`delimiter`](/reference/resource-configs/delimiter) for a seed file. +- Use packages with the same git repo and unique subdirectory. +- Access the `date_spine` macro directly from dbt-core (moved over from dbt-utils). diff --git a/website/docs/guides/migration/versions/01-upgrading-to-v1.6.md b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md similarity index 93% rename from website/docs/guides/migration/versions/01-upgrading-to-v1.6.md rename to website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md index bdb47bbf2ea..f62b6308ce6 100644 --- a/website/docs/guides/migration/versions/01-upgrading-to-v1.6.md +++ b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md @@ -1,8 +1,14 @@ --- -title: "Upgrading to v1.6 (latest)" +title: "Upgrading to v1.6" description: New features and changes in dbt Core v1.6 +id: "upgrading-to-v1.6" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + dbt Core v1.6 has three significant areas of focus: 1. 
Next milestone of [multi-project deployments](https://github.com/dbt-labs/dbt-core/discussions/6725): improvements to contracts, groups/access, versions; and building blocks for cross-project `ref` 1. Semantic layer re-launch: dbt Core and [MetricFlow](https://docs.getdbt.com/docs/build/about-metricflow) integration @@ -59,7 +65,7 @@ Supported on: - [Postgres](/reference/resource-configs/postgres-configs#materialized-view) - [Redshift](/reference/resource-configs/redshift-configs#materialized-view) - [Snowflake](/reference/resource-configs/snowflake-configs#dynamic-tables) -- Databricks (docs forthcoming) +- [Databricks](/reference/resource-configs/databricks-configs#materialized-views-and-streaming-tables) Support for BigQuery coming soon. @@ -90,4 +96,5 @@ More consistency and flexibility around packages. Resources defined in a package - [`dbt debug --connection`](/reference/commands/debug) to test just the data platform connection specified in a profile - [`dbt docs generate --empty-catalog`](/reference/commands/cmd-docs) to skip catalog population while generating docs - [`--defer-state`](/reference/node-selection/defer) enables more-granular control +- [`dbt ls`](/reference/commands/list) adds the Semantic model selection method to allow for `dbt ls -s "semantic_model:*"` and the ability to execute `dbt ls --resource-type semantic_model`. diff --git a/website/docs/guides/migration/versions/02-upgrading-to-v1.5.md b/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md similarity index 95% rename from website/docs/guides/migration/versions/02-upgrading-to-v1.5.md rename to website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md index 0c7fc7ebcad..dded8a690fe 100644 --- a/website/docs/guides/migration/versions/02-upgrading-to-v1.5.md +++ b/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md @@ -1,8 +1,14 @@ --- title: "Upgrading to v1.5" description: New features and changes in dbt Core v1.5 +id: "upgrading-to-v1.5" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + dbt Core v1.5 is a feature release, with two significant additions: 1. [**Model governance**](/docs/collaborate/govern/about-model-governance) — access, contracts, versions — the first phase of [multi-project deployments](https://github.com/dbt-labs/dbt-core/discussions/6725) 2. A Python entry point for [**programmatic invocations**](/reference/programmatic-invocations), at parity with the CLI @@ -148,4 +154,4 @@ Run `dbt --help` to see new & improved help documentation :) - The [`version: 2` top-level key](/reference/project-configs/version) is now **optional** in all YAML files. Also, the [`config-version: 2`](/reference/project-configs/config-version) and `version:` top-level keys are now optional in `dbt_project.yml` files. 
- [Events and logging](/reference/events-logging): Added `node_relation` (`database`, `schema`, `identifier`) to the `node_info` dictionary, available on node-specific events - Support setting `--project-dir` via environment variable: [`DBT_PROJECT_DIR`](/reference/dbt_project.yml) -- More granular [configurations](/reference/global-configs/about-global-configs) for logging (to set log format, log levels, and colorization) and cache population +- More granular configurations for logging (to set [log format](/reference/global-configs/logs#log-formatting), [log levels](/reference/global-configs/logs#log-level), and [colorization](/reference/global-configs/logs#color)) and [cache population](/reference/global-configs/cache#cache-population) diff --git a/website/docs/guides/migration/versions/03-upgrading-to-dbt-utils-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md similarity index 99% rename from website/docs/guides/migration/versions/03-upgrading-to-dbt-utils-v1.0.md rename to website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md index 72c6fc3c968..a7b302c9a58 100644 --- a/website/docs/guides/migration/versions/03-upgrading-to-dbt-utils-v1.0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md @@ -3,6 +3,10 @@ title: "Upgrading to dbt utils v1.0" description: New features and breaking changes to consider as you upgrade to dbt utils v1.0. --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + # Upgrading to dbt utils v1.0 For the first time, [dbt utils](https://hub.getdbt.com/dbt-labs/dbt_utils/latest/) is crossing the major version boundary. From [last month’s blog post](https://www.getdbt.com/blog/announcing-dbt-v1.3-and-utils/): diff --git a/website/docs/guides/migration/versions/04-upgrading-to-v1.4.md b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md similarity index 97% rename from website/docs/guides/migration/versions/04-upgrading-to-v1.4.md rename to website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md index 3537eb1677a..6c6d96b2326 100644 --- a/website/docs/guides/migration/versions/04-upgrading-to-v1.4.md +++ b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md @@ -1,7 +1,14 @@ --- title: "Upgrading to v1.4" description: New features and changes in dbt Core v1.4 +id: "upgrading-to-v1.4" +displayed_sidebar: "docs" --- + +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md) diff --git a/website/docs/guides/migration/versions/05-upgrading-to-v1.3.md b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md similarity index 97% rename from website/docs/guides/migration/versions/05-upgrading-to-v1.3.md rename to website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md index 5fdf559a267..f66d9bb9706 100644 --- a/website/docs/guides/migration/versions/05-upgrading-to-v1.3.md +++ b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md @@ -1,7 +1,14 @@ --- title: "Upgrading to v1.3" description: New features and changes in dbt Core v1.3 +id: "upgrading-to-v1.3" +displayed_sidebar: "docs" --- + +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md) diff --git a/website/docs/guides/migration/versions/06-upgrading-to-v1.2.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md 
similarity index 96% rename from website/docs/guides/migration/versions/06-upgrading-to-v1.2.md rename to website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md index 91ffadf9093..16825ff4e2b 100644 --- a/website/docs/guides/migration/versions/06-upgrading-to-v1.2.md +++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md @@ -1,7 +1,14 @@ --- title: "Upgrading to v1.2" description: New features and changes in dbt Core v1.2 +id: "upgrading-to-v1.2" +displayed_sidebar: "docs" --- + +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md) diff --git a/website/docs/guides/migration/versions/07-upgrading-to-v1.1.md b/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md similarity index 97% rename from website/docs/guides/migration/versions/07-upgrading-to-v1.1.md rename to website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md index 131ecc97657..7819709558e 100644 --- a/website/docs/guides/migration/versions/07-upgrading-to-v1.1.md +++ b/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md @@ -1,7 +1,14 @@ --- title: "Upgrading to v1.1" description: New features and changes in dbt Core v1.1 +id: "upgrading-to-v1.1" +displayed_sidebar: "docs" --- + +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md) diff --git a/website/docs/guides/migration/versions/08-upgrading-to-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md similarity index 98% rename from website/docs/guides/migration/versions/08-upgrading-to-v1.0.md rename to website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md index 9fc7991c087..7c67a1849a1 100644 --- a/website/docs/guides/migration/versions/08-upgrading-to-v1.0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md @@ -1,7 +1,14 @@ --- title: "Upgrading to v1.0" description: New features and changes in dbt Core v1.0 +id: "upgrading-to-v1.0" +displayed_sidebar: "docs" --- + +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Discourse](https://discourse.getdbt.com/t/3180) diff --git a/website/docs/guides/migration/versions/09-upgrading-to-v0.21.md b/website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md similarity index 97% rename from website/docs/guides/migration/versions/09-upgrading-to-v0.21.md rename to website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md index e5fbdf3fc7c..d5b429132cd 100644 --- a/website/docs/guides/migration/versions/09-upgrading-to-v0.21.md +++ b/website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md @@ -1,8 +1,15 @@ --- title: "Upgrading to v0.21" +id: "upgrading-to-v0.21" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + + :::caution Unsupported version dbt Core v0.21 has reached the end of critical support. No new patch versions will be released, and it will stop running in dbt Cloud on June 30, 2022. Read ["About dbt Core versions"](/docs/dbt-versions/core) for more details. 
::: diff --git a/website/docs/guides/migration/versions/10-upgrading-to-v0.20.md b/website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md similarity index 96% rename from website/docs/guides/migration/versions/10-upgrading-to-v0.20.md rename to website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md index 8b33bfa3879..61a7120370a 100644 --- a/website/docs/guides/migration/versions/10-upgrading-to-v0.20.md +++ b/website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md @@ -1,8 +1,13 @@ --- title: "Upgrading to v0.20" - +id: "upgrading-to-v0.20" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + :::caution Unsupported version dbt Core v0.20 has reached the end of critical support. No new patch versions will be released, and it will stop running in dbt Cloud on June 30, 2022. Read ["About dbt Core versions"](/docs/dbt-versions/core) for more details. ::: diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-11-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md similarity index 95% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-11-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md index e307c46fdf9..e91dde4c923 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-11-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.11.0" id: "upgrading-to-0-11-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ## Schema.yml v2 syntax dbt v0.11.0 adds an auto-generated docs site to your dbt project. To make effective use of the documentation site, you'll need to use the new "version 2" schema.yml syntax. For a full explanation of the version 2 syntax, check out the [schema.yml Files](/reference/configs-and-properties) section of the documentation. diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-12-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md similarity index 76% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-12-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md index 60900d3c1a4..b3d4e9d9bcb 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-12-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.12.0" id: "upgrading-to-0-12-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ## End of support Support for the `repositories:` block in `dbt_project.yml` (deprecated in 0.10.0) was removed. 
diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-13-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md similarity index 94% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-13-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md index 14a70e177e8..bb15d1a73b0 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-13-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.13.0" id: "upgrading-to-0-13-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ## Breaking changes ### on-run-start and on-run-end diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-14-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md similarity index 99% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-14-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md index 3b9c8560230..036a9a2aedf 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-14-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.14.0" id: "upgrading-to-0-14-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + This guide outlines migration instructions for: 1. [Upgrading archives to snapshots](#upgrading-to-snapshot-blocks) diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-14-1.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md similarity index 98% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-14-1.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md index a81740d5a68..215385acf0f 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-14-1.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.14.1" id: "upgrading-to-0-14-1" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + The dbt v0.14.1 release _does not_ contain any breaking code changes for users upgrading from v0.14.0. If you are upgrading from a version less than 0.14.0, consult the [Upgrading to 0.14.0](upgrading-to-0-14-0) migration guide. The following section contains important information for users of the `check` strategy on Snowflake and BigQuery. Action may be required in your database. 
## Changes to the Snapshot "check" algorithm diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-15-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md similarity index 93% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-15-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md index 02ab297c07a..6dd2b6fb9eb 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-15-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md @@ -1,10 +1,16 @@ --- title: "Upgrading to 0.15.0" id: "upgrading-to-0-15-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + The dbt v0.15.0 release contains a handful of breaking code changes for users upgrading from v0.14.0. + ## Breaking changes ### Stricter YML compilation diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-16-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md similarity index 98% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-16-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md index a34f23c4c89..076e6fc4e88 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-16-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.16.0" id: "upgrading-to-0-16-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + dbt v0.16.0 contains many new features, bug fixes, and improvements. This guide covers all of the important information to consider when upgrading from an earlier version of dbt to 0.16.0. diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-17-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md similarity index 98% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-17-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md index 1f891ebc0f4..5b863777df9 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-17-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md @@ -1,9 +1,14 @@ --- title: "Upgrading to 0.17.0" id: "upgrading-to-0-17-0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + dbt v0.17.0 makes compilation more consistent, improves performance, and fixes a number of bugs. 
## Articles: diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-18-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md similarity index 97% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-18-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md index 8092ad807b8..545bfd41ac6 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-18-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.18.0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/dev/marian-anderson/CHANGELOG.md) diff --git a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-19-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md similarity index 96% rename from website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-19-0.md rename to website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md index 0dd428780e0..db825d8af9c 100644 --- a/website/docs/guides/migration/versions/11-Older versions/upgrading-to-0-19-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md @@ -1,8 +1,13 @@ --- title: "Upgrading to 0.19.0" +displayed_sidebar: "docs" --- +import UpgradeMove from '/snippets/_upgrade-move.md'; + + + ### Resources - [Discourse](https://discourse.getdbt.com/t/1951) @@ -23,7 +28,7 @@ See the docs below for more details. We don't expect these to require action in #### Deprecations -Removed support for `config-version: 1` of dbt_project.yml, which was deprecated in v0.17.0. Use `config-version: 2` in all projects and installed packages. Otherwise, dbt will raise an error. See docs on [config-version](/reference/project-configs/config-version) and the [v0.17.0 Migration Guide](/guides/migration/versions) for details. +Removed support for `config-version: 1` of dbt_project.yml, which was deprecated in v0.17.0. Use `config-version: 2` in all projects and installed packages. Otherwise, dbt will raise an error. See docs on [config-version](/reference/project-configs/config-version) and the [v0.17.0 Migration Guide](/docs/dbt-versions/core-upgrade) for details. ### For dbt plugin maintainers diff --git a/website/docs/docs/dbt-versions/core-versions.md b/website/docs/docs/dbt-versions/core-versions.md index 2a5ce6daeb7..5e8e437f0b1 100644 --- a/website/docs/docs/dbt-versions/core-versions.md +++ b/website/docs/docs/dbt-versions/core-versions.md @@ -2,6 +2,8 @@ title: "About dbt Core versions" id: "core" description: "Learn about semantic versioning for dbt Core, and how long those versions are supported." +pagination_next: "docs/dbt-versions/upgrade-core-in-cloud" +pagination_prev: null --- dbt Core releases follow [semantic versioning](https://semver.org/) guidelines. For more on how we use semantic versions, see [How dbt Core uses semantic versioning](#how-dbt-core-uses-semantic-versioning). 
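Because dbt Core follows semantic versioning, a common pattern is to pin your installation to a single minor series so routine upgrades only pick up patch releases. Here's a minimal sketch using `pip` (the version range and adapter shown are illustrative):

```bash
# Illustrative: stay on the latest patch release within one minor series.
python -m pip install --upgrade "dbt-core>=1.7.0,<1.8.0" "dbt-postgres>=1.7.0,<1.8.0"

# Confirm which versions of dbt Core and adapters are installed.
dbt --version
```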
diff --git a/website/docs/docs/dbt-versions/experimental-features.md b/website/docs/docs/dbt-versions/experimental-features.md index 5ed0cf037ca..a621bd4ac44 100644 --- a/website/docs/docs/dbt-versions/experimental-features.md +++ b/website/docs/docs/dbt-versions/experimental-features.md @@ -3,6 +3,7 @@ title: "Preview new and experimental features in dbt Cloud" id: "experimental-features" sidebar_label: "Preview new dbt Cloud features" description: "Gain early access to many new dbt Labs experimental features by enabling this in your profile." +pagination_next: null --- dbt Labs often tests experimental features before deciding to continue on the [Product lifecycle](https://docs.getdbt.com/docs/dbt-versions/product-lifecycles#dbt-cloud). diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md index db25af163ae..6f7be90e60d 100644 --- a/website/docs/docs/dbt-versions/release-notes.md +++ b/website/docs/docs/dbt-versions/release-notes.md @@ -2,6 +2,8 @@ title: "About dbt Cloud Release Notes" id: "dbt-cloud-release-notes" description: "Release notes for dbt Cloud" +pagination_next: null +pagination_prev: null --- dbt provides release notes for dbt Cloud so you can see recent and historical changes. Generally, you'll see release notes for these changes: diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/api-v2v3-limit.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/api-v2v3-limit.md new file mode 100644 index 00000000000..9768886d5fb --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/api-v2v3-limit.md @@ -0,0 +1,15 @@ +--- +title: "API results limited to `100`" +id: apiv3-limit" +description: "Oct 2023: In order to enhance the efficiency and stability of our services, we will limit all API results to `100` records. This limit is applicable to multi-tenant instances only." +sidebar_label: "Update: API results limited to `100`" +sidebar_position: 04 +tags: [Oct-2023, API] +--- + + +Beginning December 1, 2023, the [Administrative API](/docs/dbt-cloud-apis/admin-cloud-api) v2 and v3 will expect you to limit all "list" or `GET` API methods to 100 results per API request. This limit enhances the efficiency and stability of our services. If you need to handle more than 100 results, then use the `limit` and `offset` query parameters to paginate those results; otherwise, you will receive an error. + +This maximum limit applies to [multi-tenant instances](/docs/cloud/about-cloud/regions-ip-addresses) only, and _does not_ apply to single tenant instances. + +Refer to the [API v3 Pagination](https://docs.getdbt.com/dbt-cloud/api-v3#/) or [API v2 Pagination](https://docs.getdbt.com/dbt-cloud/api-v2#/) sections for more information on how to paginate your API responses. diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/cloud-cli-pp.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/cloud-cli-pp.md new file mode 100644 index 00000000000..d96b82636f8 --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/cloud-cli-pp.md @@ -0,0 +1,31 @@ +--- +title: "New: dbt Cloud CLI in Public Preview" +description: "October 2023: Learn about the new dbt Cloud CLI development experience, now in public preview," +sidebar_position: 04 +sidebar_label: "New: dbt Cloud CLI in Public Preview" +tags: [Oct-2023, CLI, dbt Cloud] +date: 2023-10-17 +--- + +We are excited to announce the dbt Cloud CLI, **unified command line for dbt**, is available in public preview. 
It’s a local development experience, powered by dbt Cloud. It’s easy to get started: `pip3 install dbt` or `brew install dbt` and you’re ready to go. + +We will continue to invest in the dbt Cloud IDE as the easiest and most accessible way to get started using dbt, especially for data analysts who have never developed software using the command line before. We will keep improving the speed, stability, and feature richness of the IDE, as we have been [all year long](https://www.getdbt.com/blog/improvements-to-the-dbt-cloud-ide/). + +We also know that many people developing in dbt have a preference for local development, where they can use their favorite terminal, text editor, keybindings, color scheme, and so on. This includes people with data engineering backgrounds, as well as those analytics engineers who started writing code in the dbt Cloud IDE and have expanded their skills. + +The new dbt Cloud CLI offers the best of both worlds, including: + +- The power of developing against the dbt Cloud platform +- The flexibility of your own local setup + +Run whichever community-developed plugins, pre-commit hooks, or other arbitrary scripts you like. + +Some of the unique capabilities of this dbt Cloud CLI include: + +- Automatic deferral of build artifacts to your Cloud project's production environment +- Secure credential storage in the dbt Cloud platform +- Support for dbt Mesh ([cross-project `ref`](/docs/collaborate/govern/project-dependencies)) +- Development workflow for dbt Semantic Layer +- Speedier, lower cost builds + +Refer to [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to learn more. diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/custom-branch-fix-rn.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/custom-branch-fix-rn.md new file mode 100644 index 00000000000..06550b7d863 --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/custom-branch-fix-rn.md @@ -0,0 +1,14 @@ +--- +title: "Fix: Default behavior for CI job runs without a custom branch" +description: "October 2023: CI job runs now default to the main branch of the Git repository when a custom branch isn't set" +sidebar_label: "Fix: Default behavior for CI job runs without a custom branch" +tags: [Oct-2023, CI] +date: 2023-10-06 +sidebar_position: 08 +--- + +If you don't set a [custom branch](/docs/dbt-cloud-environments#custom-branch-behavior) for your dbt Cloud environment, it now defaults to the default branch of your Git repository (for example, `main`). Previously, [CI jobs](/docs/deploy/ci-jobs) would run for pull requests (PRs) that were opened against _any branch_ or updated with new commits if the **Custom Branch** option wasn't set. + +## Azure DevOps + +Your Git pull requests (PRs) might not trigger against your default branch if you're using Azure DevOps and the default branch isn't `main` or `master`. To resolve this, [set up a custom branch](/faqs/Environments/custom-branch-settings) with the branch you want to target. 
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/dbt-deps-auto-install.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/dbt-deps-auto-install.md new file mode 100644 index 00000000000..80963a9d550 --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/dbt-deps-auto-install.md @@ -0,0 +1,21 @@ +--- +title: "Enhancement: dbt Cloud auto-installs 'dbt deps' on startup" +description: "October 2023 :The dbt Cloud IDE and dbt Cloud CLI auto-handles 'dbt deps' on startup; manual run needed for 'packages.yml' changes. Available for multi-tenant users (single-tenant support coming soon) and applies to all dbt versions." +sidebar_label: "Enhancement: dbt Cloud auto-installs 'dbt deps' on startup" +tags: [Oct-2023, IDE] +date: 2023-10-17 +sidebar_position: 06 +--- + +The dbt Cloud IDE and dbt Cloud CLI now automatically installs `dbt deps` when your environment starts or when necessary. Previously, it would prompt you to run `dbt deps` during initialization. + +This improved workflow is available to all multi-tenant dbt Cloud users (Single-tenant support coming next week) and applies to dbt versions. + +However, you should still run the `dbt deps` command in these situations: + +- When you make changes to the `packages.yml` or `dependencies.yml` file during a session +- When you update the package version in the `packages.yml` or `dependencies.yml` file. +- If you edit the `dependencies.yml` file and the number of packages remains the same, run `dbt deps`. (Note that this is a known bug dbt Labs will fix in the future.) + + + diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/explorer-public-preview-rn.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/explorer-public-preview-rn.md new file mode 100644 index 00000000000..ebf5add8d03 --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/explorer-public-preview-rn.md @@ -0,0 +1,13 @@ +--- +title: "New: dbt Explorer Public Preview" +description: "October 2023: dbt Explorer is now available in Public Preview. You can use it to understand, improve, and leverage your dbt projects." +sidebar_label: "New: dbt Explorer Public Preview" +tags: [Oct-2023, Explorer] +date: 2023-10-13 +sidebar_position: 07 +--- + +On Oct 17, 2023, a Public Preview of dbt Explorer will become available to dbt Cloud customers. With dbt Explorer, you can view your project's resources (such as models, tests, and metrics) and their lineage — including interactive DAGs — to gain a better understanding of its latest production state. Navigate and manage your projects within dbt Cloud to help you and other data developers, analysts, and consumers discover and leverage your dbt resources. + +For details, refer to [Explore your dbt projects](/docs/collaborate/explore-projects). 
+ diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/native-retry-support-rn.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/native-retry-support-rn.md new file mode 100644 index 00000000000..20e56879940 --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/native-retry-support-rn.md @@ -0,0 +1,15 @@ +--- +title: "Enhancement: Native support for the dbt retry command" +description: "October 2023: Rerun errored jobs from start or from the failure point" +sidebar_label: "Enhancement: Support for dbt retry" +tags: [Oct-2023, Scheduler] +date: 2023-10-06 +sidebar_position: 10 +--- + +Previously in dbt Cloud, you could only rerun an errored job from start but now you can also rerun it from its point of failure. + +You can view which job failed to complete successully, which command failed in the run step, and choose how to rerun it. To learn more, refer to [Retry jobs](/docs/deploy/retry-jobs). + + + diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md new file mode 100644 index 00000000000..e669b037d17 --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md @@ -0,0 +1,38 @@ +--- +title: "September 2023 product docs updates" +id: "product-docs-sept" +description: "September 2023: The Product docs team merged 107 PRs, made various updates to dbt Cloud and Core, such as GAing continuous integration jobs, Semantic Layer GraphQL API doc, a new community plugin, and more" +sidebar_label: "Update: Product docs changes" +tags: [Sept-2023, product-docs] +date: 2023-10-10 +sidebar_position: 09 +--- + +Hello from the dbt Docs team: @mirnawong1, @matthewshaver, @nghi-ly, and @runleonarun! First, we’d like to thank the 15 new community contributors to docs.getdbt.com. We merged [107 PRs](https://github.com/dbt-labs/docs.getdbt.com/pulls?q=is%3Apr+merged%3A2023-09-01..2023-09-31) in September. + +Here's what's new to [docs.getdbt.com](http://docs.getdbt.com/): + +* Migrated docs.getdbt.com from Netlify to Vercel. + +## ☁ Cloud projects +- Continuous integration jobs are now generally available and no longer in beta! +- Added [Postgres PrivateLink set up page](/docs/cloud/secure/postgres-privatelink) +- Published beta docs for [dbt Explorer](/docs/collaborate/explore-projects). +- Added a new Semantic Layer [GraphQL API doc](/docs/dbt-cloud-apis/sl-graphql) and updated the [integration docs](/docs/use-dbt-semantic-layer/avail-sl-integrations) to include Hex. Responded to dbt community feedback and clarified Metricflow use cases for dbt Core and dbt Cloud. +- Added an [FAQ](/faqs/Git/git-migration) describing how to migrate from one git provider to another in dbt Cloud. +- Clarified an example and added a [troubleshooting section](/docs/cloud/connect-data-platform/connect-snowflake#troubleshooting) to Snowflake connection docs to address common errors and provide solutions. + + +## 🎯 Core projects + +- Deprecated dbt Core v1.0 and v1.1 from the docs. +- Added configuration instructions for the [AWS Glue](/docs/core/connect-data-platform/glue-setup) community plugin. +- Revised the dbt Core quickstart, making it easier to follow. Divided this guide into steps that align with the [other guides](/quickstarts/manual-install?step=1). 
+ +## New 📚 Guides, ✏️ blog posts, and FAQs + +Added a [style guide template](/guides/best-practices/how-we-style/6-how-we-style-conclusion#style-guide-template) that you can copy & paste to make sure you adhere to best practices when styling dbt projects! + +## Upcoming changes + +Stay tuned for a flurry of releases in October and a filterable guides section that will make guides easier to find! diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/sl-ga.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/sl-ga.md new file mode 100644 index 00000000000..5e53363f62a --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/sl-ga.md @@ -0,0 +1,29 @@ +--- +title: "Update: dbt Cloud Semantic Layer is Generally Available" +description: "October 2023: dbt Cloud Semantic Layer is Generally Available for all users" +sidebar_label: "Update: dbt Cloud Semantic Layer is GA" +sidebar_position: 05 +date: 2023-10-17 +tags: [Oct-2023] +--- + +:::important +If you're using the legacy Semantic Layer, we **highly** recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher and [migrate](/guides/migration/sl-migration) to the latest Semantic Layer. +::: + +dbt Labs is thrilled to announce that the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) is now generally available. It offers consistent data organization, improved governance, reduced costs, enhanced efficiency, and accessible data for better decision-making and collaboration across organizations. + +It aims to bring the best of modeling and semantics to downstream applications by introducing: + +- Brand new [integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) such as Tableau, Google Sheets, Hex, Mode, and Lightdash. +- New [Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) using GraphQL and JDBC to query metrics and build integrations. +- dbt Cloud [multi-tenant regional](/docs/cloud/about-cloud/regions-ip-addresses) support for North America, EMEA, and APAC. Single-tenant support coming soon. +- Use the APIs to call an export (a way to build tables in your data platform), then access them in your preferred BI tool. Starting from dbt v1.7 or higher, you will be able to schedule exports as part of your dbt job. + + + +The dbt Semantic Layer is available to [dbt Cloud Team or Enterprise](https://www.getdbt.com/) multi-tenant plans on dbt v1.6 or higher. +- Team and Enterprise customers can use 1,000 Queried Units per month for no additional cost on a limited trial basis, subject to reasonable use limitations. Refer to [Billing](/docs/cloud/billing#what-counts-as-a-query-unit) for more information. +- dbt Cloud Developer plans and dbt Core users can define metrics but won't be able to query them with integrated tools. 
+ + diff --git a/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/product-docs-summer-rn.md b/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/product-docs-summer-rn.md index a647bb5f585..d8148542eef 100644 --- a/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/product-docs-summer-rn.md +++ b/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/product-docs-summer-rn.md @@ -35,7 +35,7 @@ You can provide feedback by opening a pull request or issue in [our repo](https: * Added a section to introduce a new beta feature [**Extended Attributes**](/docs/dbt-cloud-environments#extended-attributes-beta), which allows users to set a flexible `profiles.yml` snippet in their dbt Cloud Environment settings. ## 🎯 Core projects -* We released [dbt 1.6](/guides/migration/versions/upgrading-to-v1.6)! We added docs for the new commands `dbt retry` and `dbt clone` +* We released [dbt 1.6](/docs/dbt-versions/core-upgrade/upgrading-to-v1.6)! We added docs for the new commands `dbt retry` and `dbt clone` ## New 📚 Guides, ✏️ blog posts, and FAQs * Check out how these community members use the dbt community in the [Community spotlight](/community/spotlight). diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md b/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md index d30bcf85b99..d78040ea7e4 100644 --- a/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md +++ b/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md @@ -26,7 +26,7 @@ Hello from the dbt Docs team: @mirnawong1, @matthewshaver, @nghi-ly, and @runleo ## 🎯 Core projects - Clearer descriptions in the [Jinja functions page](/reference/dbt-jinja-functions), that improve content for each card.  -- [1.5 Docs](/guides/migration/versions/upgrading-to-v1.5) have been released as an RC! +- [1.5 Docs](/docs/dbt-versions/core-upgrade/upgrading-to-v1.5) have been released as an RC! - See the beautiful [work captured in Core v 1.5](https://github.com/dbt-labs/docs.getdbt.com/issues?q=is%3Aissue+label%3A%22dbt-core+v1.5%22+is%3Aclosed). ## New 📚 Guides and ✏️ blog posts diff --git a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/1.0-deprecation.md b/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/1.0-deprecation.md index b11bf702330..6b6f646e40e 100644 --- a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/1.0-deprecation.md +++ b/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/1.0-deprecation.md @@ -17,5 +17,5 @@ Refer to some additional info and resources to help you upgrade your dbt version - [How to upgrade dbt without fear](https://docs.getdbt.com/blog/upgrade-dbt-without-fear) - [Upgrade Q&A on breaking changes](/docs/dbt-versions/upgrade-core-in-cloud#upgrading-legacy-versions-under-10) -- [Version migration guides](/guides/migration/versions) +- [Version migration guides](/docs/dbt-versions/core-upgrade) diff --git a/website/docs/docs/dbt-versions/release-notes/35-dbt-cloud-changelog-2019-2020.md b/website/docs/docs/dbt-versions/release-notes/35-dbt-cloud-changelog-2019-2020.md index b8e15b993de..a6b68cf9d51 100644 --- a/website/docs/docs/dbt-versions/release-notes/35-dbt-cloud-changelog-2019-2020.md +++ b/website/docs/docs/dbt-versions/release-notes/35-dbt-cloud-changelog-2019-2020.md @@ -197,7 +197,7 @@ initial support for a GitLab integration and self-service RBAC configuration. 
## dbt Cloud v1.1.7 [September 3, 2020] This release adds a Release Candidate for [dbt -v0.18.0](/guides/migration/versions) and +v0.18.0](/docs/dbt-versions/core-upgrade) and includes bugfixes and improvements to the Cloud IDE and job scheduler. diff --git a/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md b/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md index d143aab5ef1..e46294029ec 100644 --- a/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md +++ b/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md @@ -47,7 +47,7 @@ For more on version support and future releases, see [Understanding dbt Core ver #### Need help upgrading? -If you want more advice on how to upgrade your dbt projects, check out our [migration guides](/guides/migration/versions/) and our [upgrading Q&A page](/docs/dbt-versions/upgrade-core-in-cloud#upgrading-legacy-versions-under-10). +If you want more advice on how to upgrade your dbt projects, check out our [migration guides](/docs/dbt-versions/core-upgrade/) and our [upgrading Q&A page](/docs/dbt-versions/upgrade-core-in-cloud#upgrading-legacy-versions-under-10). ## Upgrading legacy versions under 1.0 @@ -96,7 +96,7 @@ clean-targets: - Do you have custom scripts that parse dbt artifacts? - (BigQuery only) Do you use dbt's legacy capabilities around ingestion-time-partitioned tables? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions/upgrading-to-v1.0). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade/upgrading-to-v1.0). @@ -109,7 +109,7 @@ If you believe your project might be affected, read more details in the migratio - Do you have custom scripts that parse dbt JSON artifacts? - (Snowflake only) Do you have custom macros or materializations that depend on using transactions, such as statement blocks with `auto_begin=True`? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade). @@ -123,7 +123,7 @@ If you believe your project might be affected, read more details in the migratio - Does your project use `adapter.dispatch` or the `spark_utils` package? - Do you have custom scripts that parse dbt JSON artifacts? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade). @@ -146,7 +146,7 @@ See **Upgrading to v0.17.latest from v0.16** below for more details. - Do you have custom scripts that parse dbt JSON artifacts? - Do you have any custom materializations? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade). @@ -157,7 +157,7 @@ If you believe your project might be affected, read more details in the migratio - Do you directly call `adapter_macro`? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade). 
@@ -235,7 +235,7 @@ models: ``` -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade). @@ -247,7 +247,7 @@ If you believe your project might be affected, read more details in the migratio - Do you use the custom `generate_schema_name` macro? - Do you use `partition_by` config for BigQuery models? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade). @@ -259,7 +259,7 @@ If you believe your project might be affected, read more details in the migratio - Do you have a custom materialization? - Do you have a macro that accesses `Relations` directly? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade).
@@ -270,7 +270,7 @@ If you believe your project might be affected, read more details in the migratio - Do you use the custom `generate_schema_name` macro? - Do you use the `--non-destructive` flag? -If you believe your project might be affected, read more details in the migration guide [here](/guides/migration/versions). +If you believe your project might be affected, read more details in the migration guide [here](/docs/dbt-versions/core-upgrade).
diff --git a/website/docs/docs/deploy/ci-jobs.md b/website/docs/docs/deploy/ci-jobs.md index fb603e2864e..d10bc780fc2 100644 --- a/website/docs/docs/deploy/ci-jobs.md +++ b/website/docs/docs/deploy/ci-jobs.md @@ -27,6 +27,7 @@ To make CI job creation easier, many options on the **CI job** page are set to d - **Job Name** — Specify the name for this CI job. - **Environment** — By default, it’s set to the environment you created the CI job from. - **Triggered by pull requests** — By default, it’s enabled. Every time a developer opens up a pull request or pushes a commit to an existing pull request, this job will get triggered to run. + - **Run on Draft Pull Request** — Enable this option if you want to also trigger the job to run every time a developer opens up a draft pull request or pushes a commit to that draft pull request. 3. Options in the **Execution Settings** section: - **Commands** — By default, it includes the `dbt build --select state:modified+` command. This informs dbt Cloud to build only new or changed models and their downstream dependents. Importantly, state comparison can only happen when there is a deferred environment selected to compare state to. Click **Add command** to add more [commands](/docs/deploy/job-commands) that you want to be invoked when this job runs. @@ -62,13 +63,13 @@ If you're not using dbt Cloud’s native Git integration with [GitHub](/docs/cl 1. Set up a CI job with the [Create Job](/dbt-cloud/api-v2#/operations/Create%20Job) API endpoint using `"job_type": ci` or from the [dbt Cloud UI](#set-up-ci-jobs). -1. Call the [Trigger Job Run](/dbt-cloud/api-v2#/operations/Trigger%20Job%20Run) API endpoint to trigger the CI job. Provide the pull request (PR) ID to the payload using one of these fields, even if you're using a different Git provider (like Bitbucket): +1. Call the [Trigger Job Run](/dbt-cloud/api-v2#/operations/Trigger%20Job%20Run) API endpoint to trigger the CI job. You must include these fields in the payload: + - Provide the pull request (PR) ID with one of these fields, even if you're using a different Git provider (like Bitbucket). This can make your code less human-readable but it will _not_ affect dbt functionality. - - `github_pull_request_id` - - `gitlab_merge_request_id` - - `azure_devops_pull_request_id`  - - This can make your code less human-readable but it will _not_ affect dbt functionality. + - `github_pull_request_id` + - `gitlab_merge_request_id` + - `azure_devops_pull_request_id`  + - Provide the `git_sha` or `git_branch` to target the correct commit or branch to run the job against. ## Example pull requests @@ -94,10 +95,18 @@ If you're experiencing any issues, review some of the common questions and answe
Temporary schemas aren't dropping
-
If your temporary schemas aren't dropping after a PR merges or closes, this typically indicates you have overridden the generate_schema_name macro and it isn't using dbt_cloud_pr_ as the prefix.



To resolve this, change your macro so that the temporary PR schema name contains the required prefix. For example: +
If your temporary schemas aren't dropping after a PR merges or closes, this typically indicates one of these issues: +
    +
  • You have overridden the generate_schema_name macro and it isn't using dbt_cloud_pr_ as the prefix.



    To resolve this, change your macro so that the temporary PR schema name contains the required prefix (see the example macro after this list). For example:



    - • ✅ Temporary PR schema name contains the prefix dbt_cloud_pr_ (like dbt_cloud_pr_123_456_marketing)

    - • ❌ Temporary PR schema name doesn't contain the prefix dbt_cloud_pr_ (like marketing).

    + ✅ Temporary PR schema name contains the prefix dbt_cloud_pr_ (like dbt_cloud_pr_123_456_marketing).

    + ❌ Temporary PR schema name doesn't contain the prefix dbt_cloud_pr_ (like marketing).

    +
  • +
    +
  • + A macro is creating a schema but there are no dbt models writing to that schema. dbt Cloud doesn't drop temporary schemas that weren't written to as a result of running a dbt model. +
  • +
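To make the first fix in this list concrete, here is a sketch of a `generate_schema_name` override that keeps the required prefix. It mirrors dbt's default behavior: a dbt Cloud CI run sets `target.schema` to the temporary `dbt_cloud_pr_` schema, so building every custom schema name on top of `target.schema` preserves the prefix. Treat it as a starting point and adapt it to your project's own logic.

```sql
-- Sketch of a generate_schema_name override that keeps the dbt_cloud_pr_ prefix.
-- In a CI run, target.schema is already the temporary dbt_cloud_pr_ schema, so
-- concatenating any custom schema name onto target.schema preserves the prefix.
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {{ default_schema }}
    {%- else -%}
        {{ default_schema }}_{{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```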
@@ -153,6 +162,3 @@ If you're experiencing any issues, review some of the common questions and answe If you're on a Virtual Private dbt Enterprise plan using security features like ingress PrivateLink or IP Allowlisting, registering CI hooks may not be available and can cause the job to fail silently.
- - - diff --git a/website/docs/docs/deploy/continuous-integration.md b/website/docs/docs/deploy/continuous-integration.md index cc856f97f22..0f87965aada 100644 --- a/website/docs/docs/deploy/continuous-integration.md +++ b/website/docs/docs/deploy/continuous-integration.md @@ -16,7 +16,7 @@ Using CI helps: ## How CI works -When you [set up CI jobs](/docs/deploy/ci-jobs#set-up-ci-jobs), dbt Cloud listens for webhooks from your Git provider indicating that a new PR has been opened or updated with new commits. When dbt Cloud receives one of these webhooks, it enqueues a new run of the CI job. If you want CI checks to run on each new commit, you need to mark your PR as **Ready for review** in your Git provider — draft PRs _don't_ trigger CI jobs. +When you [set up CI jobs](/docs/deploy/ci-jobs#set-up-ci-jobs), dbt Cloud listens for webhooks from your Git provider indicating that a new PR has been opened or updated with new commits. When dbt Cloud receives one of these webhooks, it enqueues a new run of the CI job. dbt Cloud builds and tests the models affected by the code change in a temporary schema, unique to the PR. This process ensures that the code builds without error and that it matches the expectations as defined by the project's dbt tests. The unique schema name follows the naming convention `dbt_cloud_pr__` (for example, `dbt_cloud_pr_1862_1704`) and can be found in the run details for the given run, as shown in the following image: diff --git a/website/docs/docs/deploy/deployment-overview.md b/website/docs/docs/deploy/deployment-overview.md index 5883ecaa3f1..29934663544 100644 --- a/website/docs/docs/deploy/deployment-overview.md +++ b/website/docs/docs/deploy/deployment-overview.md @@ -4,6 +4,8 @@ id: "deployments" sidebar: "Use dbt Cloud's capabilities to seamlessly run a dbt job in production." hide_table_of_contents: true tags: ["scheduler"] +pagination_next: "docs/deploy/job-scheduler" +pagination_prev: null --- Use dbt Cloud's capabilities to seamlessly run a dbt job in production or staging environments. Rather than run dbt commands manually from the command line, you can leverage the [dbt Cloud's in-app scheduling](/docs/deploy/job-scheduler) to automate how and when you execute dbt. @@ -58,6 +60,12 @@ Learn how to use dbt Cloud's features to help your team ship timely and quality link="/docs/deploy/run-visibility" icon="dbt-bit"/> + + + +## Related content +- [Retry a failed run for a job](/dbt-cloud/api-v2#/operations/Retry%20a%20failed%20run%20for%20a%20job) API endpoint +- [Run visibility](/docs/deploy/run-visibility) +- [Jobs](/docs/deploy/jobs) +- [Job commands](/docs/deploy/job-commands) \ No newline at end of file diff --git a/website/docs/docs/environments-in-dbt.md b/website/docs/docs/environments-in-dbt.md index 54eaa68f667..70bc096cf4f 100644 --- a/website/docs/docs/environments-in-dbt.md +++ b/website/docs/docs/environments-in-dbt.md @@ -2,6 +2,7 @@ title: "About environments" id: "environments-in-dbt" hide_table_of_contents: true +pagination_next: null --- In software engineering, environments are used to enable engineers to develop and test code without impacting the users of their software. 
Typically, there are two types of environments in dbt: @@ -18,7 +19,7 @@ Configure environments to tell dbt Cloud or dbt Core how to build and execute yo diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md index c4cfd6e45ac..0aeef0201cb 100644 --- a/website/docs/docs/introduction.md +++ b/website/docs/docs/introduction.md @@ -1,6 +1,8 @@ --- title: "What is dbt?" id: "introduction" +pagination_next: null +pagination_prev: null --- @@ -28,6 +30,7 @@ Read more about why we want to enable analysts to work more like software engine You can access dbt using dbt Core or dbt Cloud. dbt Cloud is built around dbt Core, but it also provides: - Web-based UI so it’s more accessible +- dbt Cloud-powered command line (CLI) to develop, test, version control dbt projects, and run dbt commands - Hosted environment so it’s faster to get up and running - Differentiated features, such as metadata, in-app job scheduler, observability, integrations with other tools, integrated development environment (IDE), and more. @@ -35,7 +38,8 @@ You can learn about plans and pricing on [www.getdbt.com](https://www.getdbt.com ### dbt Cloud -dbt Cloud is the fastest and most reliable way to deploy dbt. Develop, test, schedule, and investigate data models all in one web-based UI. Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features) and try one of the [dbt Cloud quickstarts](/quickstarts). +dbt Cloud is the fastest and most reliable way to deploy dbt. Develop, test, schedule, and investigate data models all in one web-based UI. It also natively supports developing using a command line with the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). +Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features) and try one of the [dbt Cloud quickstarts](/quickstarts). ### dbt Core diff --git a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md index 9bd57e0b280..b3b6ffb3e45 100644 --- a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md +++ b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md @@ -1,14 +1,25 @@ --- title: "Run your dbt projects" id: "run-your-dbt-projects" +pagination_prev: null --- -You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud-features) and [dbt Core](https://github.com/dbt-labs/dbt-core). dbt Cloud is a hosted application where you can develop directly from a web browser. dbt Core is an open source project where you can develop from the command line. +You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud-features) or [dbt Core](https://github.com/dbt-labs/dbt-core): -Among other features, dbt Cloud provides a development environment to help you build, test, run, and [version control](/docs/collaborate/git-version-control) your project faster. It also includes an easier way to share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team. These development tasks are directly built into dbt Cloud for an _integrated development environment_ (IDE). Refer to [Develop in the Cloud](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) for more details. +- **dbt Cloud**: A hosted application where you can develop directly from a web browser using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). It also natively supports developing using a command line interface, [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). 
Among other features, dbt Cloud provides: -With dbt Core, you can run your dbt projects from the command line. The command line interface (CLI) is available from your computer's terminal application such as Terminal and iTerm. When using the command line, you can run commands and do other work from the current working directory on your computer. Before running the dbt project from the command line, make sure you are working in your dbt project directory. Learning terminal commands such as `cd` (change directory), `ls` (list directory contents), and `pwd` (present working directory) can help you navigate the directory structure on your system. + - Development environment to help you build, test, run, and [version control](/docs/collaborate/git-version-control) your project faster. + - Share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team. + - Integrates with the dbt Cloud IDE, allowing you to run development tasks and manage your development environment in the dbt Cloud UI for a seamless experience. + - The dbt Cloud CLI to develop and run dbt commands against your dbt Cloud development environment from your local command line. + - For more details, refer to [Develop in the Cloud](/docs/cloud/about-cloud-develop). -When running your project from dbt Core or dbt Cloud, the commands you commonly use are: +- **dbt Core**: An open source project where you can develop from the [command line](/docs/core/about-dbt-core). + +The dbt Cloud CLI and dbt Core are both command line tools that enable you to run dbt commands. The key distinction is that the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features). + +The command line is available from your computer's terminal application such as Terminal and iTerm. With the command line, you can run commands and do other work from the current working directory on your computer. Before running the dbt project from the command line, make sure you are working in your dbt project directory. Learning terminal commands such as `cd` (change directory), `ls` (list directory contents), and `pwd` (present working directory) can help you navigate the directory structure on your system. 
+ +In dbt Cloud or dbt Core, the commands you commonly use are: - [dbt run](/reference/commands/run) — Runs the models you defined in your project - [dbt build](/reference/commands/build) — Builds and tests your selected resources such as models, seeds, snapshots, and tests @@ -20,6 +31,7 @@ For information on all dbt commands and their arguments (flags), see the [dbt co - [How we set up our computers for working on dbt projects](https://discourse.getdbt.com/t/how-we-set-up-our-computers-for-working-on-dbt-projects/243) - [Model selection syntax](/reference/node-selection/syntax) +- [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) - [Cloud IDE features](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud#ide-features) - [Does dbt offer extract and load functionality?](/faqs/Project/transformation-tool) - [Why does dbt compile need a data platform connection](/faqs/Warehouse/db-connection-dbt-compile) diff --git a/website/docs/docs/running-a-dbt-project/using-threads.md b/website/docs/docs/running-a-dbt-project/using-threads.md index 519ce8aab81..5eede7abc27 100644 --- a/website/docs/docs/running-a-dbt-project/using-threads.md +++ b/website/docs/docs/running-a-dbt-project/using-threads.md @@ -3,7 +3,7 @@ title: "Using threads" id: "using-threads" sidebar_label: "Use threads" description: "Understand what threads mean and how to use them." - +pagination_next: null --- When dbt runs, it creates a directed acyclic graph (DAG) of links between models. The number of threads represents the maximum number of paths through the graph dbt may work on at once – increasing the number of threads can minimize the run time of your project. @@ -18,7 +18,7 @@ Generally the optimal number of threads depends on your data warehouse and its c You can use a different number of threads than the value defined in your target by using the `--threads` option when executing a dbt command. -You will define the number of threads in your `profiles.yml` file (for CLI-users only), dbt Cloud job definition, and dbt Cloud development credentials under your profile. +You will define the number of threads in your `profiles.yml` file (for dbt Core users only), dbt Cloud job definition, and dbt Cloud development credentials under your profile. ## Related docs diff --git a/website/docs/docs/supported-data-platforms.md b/website/docs/docs/supported-data-platforms.md index 8ac782991c8..a8e146f49d0 100644 --- a/website/docs/docs/supported-data-platforms.md +++ b/website/docs/docs/supported-data-platforms.md @@ -4,14 +4,20 @@ id: "supported-data-platforms" sidebar_label: "Supported data platforms" description: "Connect dbt to any data platform in dbt Cloud or dbt Core, using a dedicated adapter plugin" hide_table_of_contents: true +pagination_next: "docs/connect-adapters" +pagination_prev: null --- dbt connects to and runs SQL against your database, warehouse, lake, or query engine. These SQL-speaking platforms are collectively referred to as _data platforms_. dbt connects with data platforms by using a dedicated adapter plugin for each. Plugins are built as Python modules that dbt Core discovers if they are installed on your system. Read [What are Adapters](/guides/dbt-ecosystem/adapter-development/1-what-are-adapters) for more info. -You can [connect](/docs/connect-adapters) to adapters and data platforms either directly in the dbt Cloud user interface (UI) or install them manually using the command line (CLI). 
+You can [connect](/docs/connect-adapters) to adapters and data platforms natively in dbt Cloud or install them manually using dbt Core. You can also further customize how dbt works with your specific data platform via configuration: see [Configuring Postgres](/reference/resource-configs/postgres-configs) for an example. +import MSCallout from '/snippets/_microsoft-adapters-soon.md'; + + + ## Types of Adapters There are three types of adapters available today: @@ -36,5 +42,5 @@ import AdaptersTrusted from '/snippets/_adapters-trusted.md'; -
* Install these adapters using the CLI as they're not currently supported in dbt Cloud.
+
* Install these adapters using dbt Core as they're not currently supported in dbt Cloud.
diff --git a/website/docs/docs/trusted-adapters.md b/website/docs/docs/trusted-adapters.md index e19bb40785f..08191e8ea42 100644 --- a/website/docs/docs/trusted-adapters.md +++ b/website/docs/docs/trusted-adapters.md @@ -6,7 +6,7 @@ hide_table_of_contents: true Trusted adapters are adapters not maintained by dbt Labs, that we feel comfortable recommending to users for use in production. -Free and open-source tools for the data professional are increasingly abundant. This is by-and-large a *good thing*, however it requires due dilligence that wasn't required in a paid-license, closed-source software world. As a user, there are questions to answer important before taking a dependency on an open-source project. The trusted adapter designation is meant to streamline this process for end users. +Free and open-source tools for the data professional are increasingly abundant. This is by-and-large a *good thing*, however it requires due diligence that wasn't required in a paid-license, closed-source software world. As a user, there are questions to answer important before taking a dependency on an open-source project. The trusted adapter designation is meant to streamline this process for end users.
Considerations for depending on an open-source project diff --git a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md index b084dedc305..22178a14b5b 100644 --- a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md +++ b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md @@ -4,36 +4,35 @@ id: avail-sl-integrations description: "Discover the diverse range of partners that seamlessly integrate with the powerful dbt Semantic Layer, allowing you to query and unlock valuable insights from your data ecosystem." tags: [Semantic Layer] sidebar_label: "Available integrations" +hide_table_of_contents: true meta: api_name: dbt Semantic Layer APIs --- -import NewSLChanges from '/snippets/_new-sl-changes.md'; - - - There are a number of data applications that seamlessly integrate with the dbt Semantic Layer, powered by MetricFlow, from business intelligence tools to notebooks, spreadsheets, data catalogs, and more. These integrations allow you to query and unlock valuable insights from your data ecosystem. Use the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) to simplify metric queries, optimize your development workflow, and reduce coding. This approach also ensures data governance and consistency for data consumers. - - - import AvailIntegrations from '/snippets/_sl-partner-links.md'; ### Custom integration -You can create custom integrations using different languages and tools. We support connecting with JDBC, ADBC, and a GraphQL API. For more info, check out [our examples on GitHub](https://github.com/dbt-labs/example-semantic-layer-clients/). +- You can create custom integrations using different languages and tools. We support connecting with JDBC, ADBC, and GraphQL APIs. For more info, check out [our examples on GitHub](https://github.com/dbt-labs/example-semantic-layer-clients/). +- You can also connect to tools that allow you to write SQL. These tools must meet one of the two criteria: + + - Supports a generic JDBC driver option (such as DataGrip) or + - Uses Arrow Flight SQL JDBC driver version 12.0.0 or higher. ## Related docs -- {frontMatter.meta.api_name} to learn how to integrate with JDBC and GraphQL to query your metrics in downstream tools. -- [dbt Semantic Layer APIs query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) +- {frontMatter.meta.api_name} to learn how to integrate and query your metrics in downstream tools. +- [dbt Semantic Layer API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) +- [Hex dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex. 
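To give a feel for the custom, SQL-based integration path described in the changes above, here is a sketch of a metric query sent over the JDBC interface. The metric and dimension names (`revenue`, `metric_time`) are placeholders rather than names from any real project, and the exact syntax is documented in the dbt Semantic Layer API pages linked above.

```sql
-- Hypothetical JDBC query: fetch a 'revenue' metric grouped by month.
-- Replace 'revenue' and 'metric_time' with metrics and dimensions defined
-- in your own semantic models.
select *
from {{
    semantic_layer.query(
        metrics=['revenue'],
        group_by=[Dimension('metric_time').grain('month')]
    )
}}
```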
diff --git a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md index 76753b41ffa..8c78d556a67 100644 --- a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md @@ -5,13 +5,12 @@ description: "Learn how the dbt Semantic Layer enables data teams to centrally d sidebar_label: "About the dbt Semantic Layer" tags: [Semantic Layer] hide_table_of_contents: true +pagination_next: "docs/use-dbt-semantic-layer/quickstart-sl" +pagination_prev: null --- -import NewSLChanges from '/snippets/_new-sl-changes.md'; - - The dbt Semantic Layer, powered by [MetricFlow](/docs/build/about-metricflow), simplifies the process of defining and using critical business metrics, like `revenue` in the modeling layer (your dbt project). By centralizing metric definitions, data teams can ensure consistent self-service access to these metrics in downstream data tools and applications. The dbt Semantic Layer eliminates duplicate coding by allowing data teams to define metrics on top of existing models and automatically handles data joins. @@ -26,10 +25,8 @@ Refer to the [Why we need a universal semantic layer](https://www.getdbt.com/blo import Features from '/snippets/_sl-plan-info.md'
@@ -46,12 +43,6 @@ instance="hosted in North America" link="/docs/use-dbt-semantic-layer/setup-sl" icon="dbt-bit"/> - - + + diff --git a/website/docs/docs/use-dbt-semantic-layer/gsheets.md b/website/docs/docs/use-dbt-semantic-layer/gsheets.md new file mode 100644 index 00000000000..ee391c91b70 --- /dev/null +++ b/website/docs/docs/use-dbt-semantic-layer/gsheets.md @@ -0,0 +1,63 @@ +--- +title: "Google Sheets (beta)" +description: "Integrate with Google Sheets to query your metrics in a spreadsheet." +tags: [Semantic Layer] +sidebar_label: "Google Sheets (beta)" +--- + +:::info Beta functionality +Google Sheets integration with the dbt Semantic Layer is a [beta](/docs/dbt-versions/product-lifecycles#dbt-cloud) feature. +::: + +The dbt Semantic Layer offers a seamless integration with Google Sheets through a custom menu. This add-on allows you to build dbt Semantic Layer queries and return data on your metrics directly within Google Sheets. + +## Prerequisites + +1. You have a Google account with access to Google Sheets. +2. You can install Google add-ons. +3. You have [set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl). +4. You have a dbt Cloud Environment ID and a [service token](/docs/dbt-cloud-apis/service-tokens) to authenticate with from a dbt Cloud account. + +## Installing the add-on + +1. Navigate to the [dbt Semantic Layer for Sheets App](https://gsuite.google.com/marketplace/app/foo/392263010968) to install the add-on. + + - You can also find it in Google Sheets by going to [**Extensions -> Add-on -> Get add-ons**](https://support.google.com/docs/answer/2942256?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Cinstall-add-ons%2Cinstall-an-add-on) and searching for it there. +2. After installing, open the Add-On menu and select the "dbt Semantic Layer for Sheets". This will open a custom menu to the right-hand side of your screen. +3. Authenticate with your Host, dbt Cloud Environment ID, and Service Token. +4. Start querying your metrics using the **Query Builder**. For more info on the menu functions, refer to [Custom menu functions](#custom-menu-functions). + +When querying your data with Google Sheets: + +- It returns the data to the cell you have clicked on. +- The custom menu operation has a timeout limit of six (6) minutes. +- If you're using this extension, make sure you're signed into Chrome with the same Google profile you used to set up the Add-On. Log in with one Google profile at a time as using multiple Google profiles at once might cause issues. + + +## Custom menu functions + +The custom menu provides the following capabilities: + +| Menu items | Description | +|---------------|-------------------------------------------------------| +| Metrics | Search and select metrics. | +| Group By | Search and select dimensions to group by. Dimensions are grouped by the entity of the semantic model they come from. | +| Granularity | Modify the granularity of the primary time dimension. | +| Where | Filter your data. This includes categorical and time filters. | +| Order By | Return your data ordered. | +| Limit | Set a limit for the rows of your output. | + + +## Filtering data + +To use the filter functionality, choose the [dimension](/docs/build/dimensions) you want to filter by and select the operation you want to filter on. + - For categorical dimensions, type in the dimension value you want to filter by (no quotes needed) and press enter. + - Continue adding additional filters as needed with AND and OR. 
If it's a time dimension, choose the operator and select from the calendar. + + + +**Limited Use Policy Disclosure** + +The dbt Semantic Layer for Sheet's use and transfer to any other app of information received from Google APIs will adhere to [Google API Services User Data Policy](https://developers.google.com/terms/api-services-user-data-policy), including the Limited Use requirements. + + diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 3bbc11cea3f..cca899d227e 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -10,17 +10,15 @@ meta: -import NewSLChanges from '/snippets/_new-sl-changes.md'; -import InstallMetricFlow from '/snippets/_sl-install-metricflow.md'; + import CreateModel from '/snippets/_sl-create-semanticmodel.md'; import DefineMetrics from '/snippets/_sl-define-metrics.md'; import ConfigMetric from '/snippets/_sl-configure-metricflow.md'; import TestQuery from '/snippets/_sl-test-and-query-metrics.md'; +import ConnectQueryAPI from '/snippets/_sl-connect-and-query-api.md'; +import RunProdJob from '/snippets/_sl-run-prod-job.md'; - - - The dbt Semantic Layer, powered by [MetricFlow](/docs/build/about-metricflow), simplifies defining and using critical business metrics. It centralizes metric definitions, eliminates duplicate coding, and ensures consistent self-service access to metrics in downstream tools. MetricFlow, a powerful component of the dbt Semantic Layer, simplifies the creation and management of company metrics. It offers flexible abstractions, SQL query generation, and enables fast retrieval of metric datasets from a data platform. @@ -29,15 +27,15 @@ Use this guide to fully experience the power of the universal dbt Semantic Layer - [Create a semantic model](#create-a-semantic-model) in dbt Cloud using MetricFlow - [Define metrics](#define-metrics) in dbt Cloud using MetricFlow -- [Test and query metrics locally](#test-and-query-metrics) using MetricFlow +- [Test and query metrics](#test-and-query-metrics) with MetricFlow - [Run a production job](#run-a-production-job) in dbt Cloud - [Set up dbt Semantic Layer](#setup) in dbt Cloud - [Connect and query API](#connect-and-query-api) with dbt Cloud - -MetricFlow allows users to define metrics in their dbt project whether in dbt Cloud or in dbt Core. dbt Core users can use the [MetricFlow CLI](/docs/build/metricflow-cli) to define metrics in their local dbt Core project. +MetricFlow allows you to define metrics in your dbt project and query them whether in dbt Cloud or dbt Core with [MetricFlow commands](/docs/build/metricflow-commands). However, to experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. + ## Prerequisites import SetUp from '/snippets/_v2-sl-prerequisites.md'; @@ -62,13 +60,8 @@ New to dbt or metrics? Try our [Jaffle shop example project](https://github.com/ ## Run a production job -Once you’ve defined metrics in your dbt project, you can perform a job run in your deployment environment in dbt Cloud to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer at this moment. -1. Go to **Deploy** in the navigation header -2. Select **Jobs** to re-run the job with the most recent code in the deployment environment. -3. 
Your metric should appear as a red node in the dbt Cloud IDE and dbt directed acyclic graphs (DAG). - - +
@@ -88,16 +81,7 @@ import SlSetUp from '/snippets/_new-sl-setup.md'; ## Connect and query API -You can query your metrics in a JDBC-enabled tool or use existing first-class integrations with the dbt Semantic Layer. - -You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment, hosted in North America (Additional region support coming soon). - -- To learn how to use the JDBC or GraphQL API and what tools you can query it with, refer to the {frontMatter.meta.api_name}.
- - * To authenticate, you need to [generate a service token](/docs/dbt-cloud-apis/service-tokens) with Semantic Layer Only and Metadata Only permissions. - * Refer to the [SQL query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the APIs. - -- To learn more about the sophisticated integrations that connect to the dbt Semantic Layer, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) for more info. + ## FAQs @@ -124,6 +108,7 @@ The dbt Semantic Layer is proprietary, however, some components of the dbt Seman - [Build your metrics](/docs/build/build-metrics-intro) - [Set up dbt Semantic Layer](docs/use-dbt-semantic-layer/setup-dbt-sl) - [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) +- Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) diff --git a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md index a2395d367e7..4c88ee50b25 100644 --- a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md @@ -8,9 +8,6 @@ tags: [Semantic Layer] -import NewSLChanges from '/snippets/_new-sl-changes.md'; - - With the dbt Semantic Layer, you can centrally define business metrics, reduce code duplication and inconsistency, create self-service in downstream tools, and more. Configure the dbt Semantic Layer in dbt Cloud to connect with your integrated partner tool. diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md index 89cd9bc6ddc..152821b7e59 100644 --- a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md +++ b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md @@ -4,12 +4,9 @@ id: sl-architecture description: "dbt Semantic Layer product architecture and related questions." sidebar_label: "Architecture" tags: [Semantic Layer] +pagination_next: null --- -import NewSLChanges from '/snippets/_new-sl-changes.md'; - - - @@ -22,12 +19,12 @@ The dbt Semantic Layer allows you to define metrics and use various interfaces t The dbt Semantic Layer includes the following components: -| Components | Information | Developer plans | Team plans | Enterprise plans | License | +| Components | Information | dbt Core users | Developer plans | Team plans | Enterprise plans | License | | --- | --- | :---: | :---: | :---: | --- | -| **[MetricFlow](/docs/build/about-metricflow)** | MetricFlow in dbt allows users to centrally define their semantic models and metrics with YAML specifications. | ✅ | ✅ | ✅ | BSL package (code is source available) | -| **MetricFlow Server**| A proprietary server that takes metric requests and generates optimized SQL for the specific data platform. | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| -| **Semantic Layer Gateway** | A service that passes queries to MetricFlow server and executes the SQL generated by MetricFlow against the data platform|

❌| ✅ | ✅ | Proprietary, Cloud (Team & Enterprise) | -| **Semantic Layer API** | The interfaces that allow users to submit metric queries include the MetricFlow CLI and JDBC API. They also serve as the foundation for building first-class integrations with various tools. | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| +| **[MetricFlow](/docs/build/about-metricflow)** | MetricFlow in dbt allows users to centrally define their semantic models and metrics with YAML specifications. | ✅ | ✅ | ✅ | ✅ | BSL package (code is source available) | +| **MetricFlow Server**| A proprietary server that takes metric requests and generates optimized SQL for the specific data platform. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| +| **Semantic Layer Gateway** | A service that passes queries to the MetricFlow server and executes the SQL generated by MetricFlow against the data platform|

❌ | ❌ |✅ | ✅ | Proprietary, Cloud (Team & Enterprise) | +| **Semantic Layer APIs** | The interfaces allow users to submit metric queries using GraphQL and JDBC APIs. They also serve as the foundation for building first-class integrations with various tools. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| ## Related questions diff --git a/website/docs/docs/use-dbt-semantic-layer/tableau.md b/website/docs/docs/use-dbt-semantic-layer/tableau.md new file mode 100644 index 00000000000..c93643354aa --- /dev/null +++ b/website/docs/docs/use-dbt-semantic-layer/tableau.md @@ -0,0 +1,72 @@ +--- +title: "Tableau (beta)" +description: "Use Tableau worksheets to query the dbt Semantic Layer and produce dashboards with trusted date." +tags: [Semantic Layer] +sidebar_label: "Tableau (beta)" +--- + +:::info Beta functionality +The Tableau integration with the dbt Semantic Layer is a [beta feature](/docs/dbt-versions/product-lifecycles#dbt-cloud). +::: + + +The Tableau integration allows you to use worksheets to query the Semantic Layer directly and produce your dashboards with trusted data. + +This integration provides a live connection to the dbt Semantic Layer through Tableau Desktop. + +## Prerequisites + +1. You must have [Tableau Desktop](https://www.tableau.com/en-gb/products/desktop) installed with version 2021.1 or greater + - Note that Tableau Online does not currently support custom connectors natively. +2. Log in to Tableau Desktop using either your license or the login details you use for Tableau Server or Tableau Online. +3. You need your dbt Cloud host, [Environment ID](/docs/use-dbt-semantic-layer/setup-sl#set-up-dbt-semantic-layer) and [service token](/docs/dbt-cloud-apis/service-tokens) to log in. This account should be set up with the dbt Semantic Layer. +4. You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing) and multi-tenant [deployment](/docs/cloud/about-cloud/regions-ip-addresses). (Single-Tenant coming soon) + + +## Installing + +1. Download the GitHub [connector file](https://github.com/dbt-labs/semantic-layer-tableau-connector/releases/download/v1.0.2/dbt_semantic_layer.taco) locally and add it to your default folder: + - Windows: `C:\Users\\[Windows User]\Documents\My Tableau Repository\Connectors` + - Mac: `/Users/[user]/Documents/My Tableau Repository/Connectors` + - Linux: `/opt/tableau/connectors` +2. Install the [JDBC driver](/docs/dbt-cloud-apis/sl-jdbc) to the folder based on your operating system: + - Windows: `C:\Program Files\Tableau\Drivers` + - Mac: `~/Library/Tableau/Drivers` + - Linux: ` /opt/tableau/tableau_driver/jdbc` +3. Open Tableau Desktop and find the **dbt Semantic Layer by dbt Labs** connector on the left-hand side. +4. Connect with your Host, Environment ID, and service token information that's provided to you in your dbt Cloud Semantic Layer configuration. + + +## Using the integration + +Once you authenticate, the system will direct you to the data source page with all the metrics and dimensions configured in your Semantic Layer. + +- From there, go directly to a worksheet in the bottom left-hand corner. +- Then, you'll find all the metrics and dimensions that are available to query on the left-hand side of your window. + +Visit the [Tableau documentation](https://help.tableau.com/current/pro/desktop/en-us/gettingstarted_overview.htm) to learn more about how to use Tableau worksheets and dashboards. + +## Things to note + +- All metrics use the "SUM" aggregation type, and this can't be altered. 
The dbt Semantic Layer controls the aggregation type and it is intentionally fixed. Keep in mind that the underlying aggregation in the dbt Semantic Layer might not be "SUM" (even though "SUM" is Tableau's default). +- Tableau surfaces all metrics and dimensions from the dbt Semantic Layer on the left-hand side. Note, that not all metrics and dimensions can be combined with one another. You will receive an error message if a particular dimension cannot be sliced with a metric (or vice versa). + - To display available metrics and dimensions, dbt Semantic Layer returns metadata for a fake table with the dimensions and metrics as 'columns' on this table. Because of this, you can't actually query this table for previews or extracts. + - Since this is treated as a table, dbt Semantic Layer can't dynamically change what is available. This means we display _all_ available metrics and dimensions even if a particular metric and dimension combination isn't available. + +- Certain Table calculations like "Totals" and "Percent Of" may not be accurate when using metrics aggregated in a non-additive way (such as count distinct) +- In any of our Semantic Layer interfaces (not only Tableau), you must include a [time dimension](/docs/build/cumulative#limitations) when working with any cumulative metric that has a time window or granularity. + +## Unsupported functionality + +The following Tableau features aren't supported at this time, however, the dbt Semantic Layer may support some of this functionality in a future release: + +- Updating the data source page +- Using "Extract" mode to view your data +- Unioning Tables +- Writing Custom SQL +- Table Extensions +- Cross Database Joins +- All functions in Analysis --> Create Calculated Field +- Filtering on a Date Part time dimension for a Cumulative metric type +- Changing your date dimension to use "Week Number" + diff --git a/website/docs/docs/verified-adapters.md b/website/docs/docs/verified-adapters.md index a2d28a612d6..170bc8f885b 100644 --- a/website/docs/docs/verified-adapters.md +++ b/website/docs/docs/verified-adapters.md @@ -13,6 +13,10 @@ The verification process serves as the on-ramp to integration with dbt Cloud. As To learn more, see [Verifying a new adapter](/guides/dbt-ecosystem/adapter-development/7-verifying-a-new-adapter). +import MSCallout from '/snippets/_microsoft-adapters-soon.md'; + + + Here are the verified data platforms that connect to dbt and its latest version. import AdaptersVerified from '/snippets/_adapters-verified.md'; diff --git a/website/docs/faqs/Environments/custom-branch-settings.md b/website/docs/faqs/Environments/custom-branch-settings.md index 95929d2d393..4bc4b85be02 100644 --- a/website/docs/faqs/Environments/custom-branch-settings.md +++ b/website/docs/faqs/Environments/custom-branch-settings.md @@ -1,7 +1,7 @@ --- -title: How do I use the `Custom Branch` settings in a dbt Cloud Environment? +title: How do I use the 'Custom Branch' settings in a dbt Cloud Environment? description: "Use custom code from your repository" -sidebar_label: 'Custom Branch settings' +sidebar_label: 'Custom branch settings' id: custom-branch-settings --- @@ -15,12 +15,21 @@ To specify a custom branch: ## Development -In a development environment, the default branch (commonly the `main` branch) is a read-only branch found in the IDE's connected repositories, which you can use to create development branches. Identifying a custom branch overrides this default behavior. 
Instead, your custom branch becomes read-only and can be used to create development branches. You will no longer be able to make commits to the custom branch from within the dbt Cloud IDE. +In a development environment, the default branch (usually named `main`) is a read-only branch in your connected repositories, which allows you to create new branches for development from it. -For example, you can use the `develop` branch of a connected repository. Edit an environment, select **Only run on a custom branch** in **General settings** , enter **develop** as the name of your custom branch. +Specifying a **Custom branch** overrides the default behavior. It makes the custom branch 'read-only' and enables you to create new development branches from it. This also means you can't edit this custom branch directly. - +Only one branch can be read-only, which means when you set up a custom branch, your `main` branch (usually read-only) becomes editable. If you want to protect the `main` branch and prevent any commits on it, you need to set up branch protection rules in your git provider settings. This ensures your `main` branch remains secure and no new commits can be made to it. + +For example, if you want to use the `develop` branch of a connected repository: + +- Go to an environment and select **Settings** to edit it +- Select **Only run on a custom branch** in **General settings** +- Enter **develop** as the name of your custom branch +- Click **Save** + + ## Deployment -When running jobs in a deployment environment, dbt will clone your project from your connected repository before executing your models. By default, dbt uses the default branch of your repository (commonly the `main` branch). To specify a different version of your project for dbt to execute during job runs in a particular environment, you can edit the Custom Branch setting as shown in the previous steps. \ No newline at end of file +When running jobs in a deployment environment, dbt will clone your project from your connected repository before executing your models. By default, dbt uses the default branch of your repository (commonly the `main` branch). To specify a different version of your project for dbt to execute during job runs in a particular environment, you can edit the Custom Branch setting as shown in the previous steps. diff --git a/website/docs/faqs/Models/reference-models-in-another-project.md b/website/docs/faqs/Models/reference-models-in-another-project.md deleted file mode 100644 index 19f3f52da31..00000000000 --- a/website/docs/faqs/Models/reference-models-in-another-project.md +++ /dev/null @@ -1,11 +0,0 @@ ---- -title: How can I reference models or macros in another project? -description: "Use packages to add another project to your dbt project" -sidebar_label: 'Reference models or macros in another project' -id: reference-models-in-another-project - ---- - -You can use [packages](/docs/build/packages) to add another project to your dbt -project, including other projects you've created. Check out the [docs](/docs/build/packages) -for more information! diff --git a/website/docs/faqs/Project/which-schema.md b/website/docs/faqs/Project/which-schema.md index f0634ac8c85..2c21cba3c6a 100644 --- a/website/docs/faqs/Project/which-schema.md +++ b/website/docs/faqs/Project/which-schema.md @@ -7,7 +7,7 @@ id: which-schema --- By default, dbt builds models in your target schema. To change your target schema: * If you're developing in **dbt Cloud**, these are set for each user when you first use a development environment. 
-* If you're developing with the **dbt CLI**, this is the `schema:` parameter in your `profiles.yml` file. +* If you're developing with **dbt Core**, this is the `schema:` parameter in your `profiles.yml` file. If you wish to split your models across multiple schemas, check out the docs on [using custom schemas](/docs/build/custom-schemas). diff --git a/website/docs/faqs/Runs/checking-logs.md b/website/docs/faqs/Runs/checking-logs.md index dbfdb6806a1..ff5e6f5cf04 100644 --- a/website/docs/faqs/Runs/checking-logs.md +++ b/website/docs/faqs/Runs/checking-logs.md @@ -10,7 +10,7 @@ To check out the SQL that dbt is running, you can look in: * dbt Cloud: * Within the run output, click on a model name, and then select "Details" -* dbt CLI: +* dbt Core: * The `target/compiled/` directory for compiled `select` statements * The `target/run/` directory for compiled `create` statements * The `logs/dbt.log` file for verbose logging. diff --git a/website/docs/faqs/Runs/failed-tests.md b/website/docs/faqs/Runs/failed-tests.md index bfee565ef61..d19023d035d 100644 --- a/website/docs/faqs/Runs/failed-tests.md +++ b/website/docs/faqs/Runs/failed-tests.md @@ -10,7 +10,7 @@ To debug a failing test, find the SQL that dbt ran by: * dbt Cloud: * Within the test output, click on the failed test, and then select "Details" -* dbt CLI: +* dbt Core: * Open the file path returned as part of the error message. * Navigate to the `target/compiled/schema_tests` directory for all compiled test queries diff --git a/website/docs/faqs/Warehouse/bq-oauth-drive-scope.md b/website/docs/faqs/Warehouse/bq-oauth-drive-scope.md new file mode 100644 index 00000000000..ae6da82c47a --- /dev/null +++ b/website/docs/faqs/Warehouse/bq-oauth-drive-scope.md @@ -0,0 +1,8 @@ +--- +title: Why does the BigQuery OAuth application require scopes to Google Drive? +description: "Learn more about Google Drive scopes in the BigQuery OAuth application" +sidebar_label: "BigQuery OAuth Drive Scopes" +id: bq-oauth-drive-scope +--- + +BigQuery supports external tables over both personal Google Drive files and shared files. For more information, refer to [Create Google Drive external tables](https://cloud.google.com/bigquery/docs/external-data-drive). diff --git a/website/docs/faqs/Warehouse/database-privileges.md b/website/docs/faqs/Warehouse/database-privileges.md index 73e0549f130..3761b81fe67 100644 --- a/website/docs/faqs/Warehouse/database-privileges.md +++ b/website/docs/faqs/Warehouse/database-privileges.md @@ -12,8 +12,8 @@ schema¹ * read system views to generate documentation (i.e. views in `information_schema`) -On Postgres, Redshift, and Snowflake, use a series of `grants` to ensure that -your user has the correct privileges. +On Postgres, Redshift, Databricks, and Snowflake, use a series of `grants` to ensure that +your user has the correct privileges. Check out [example permissions](/reference/database-permissions/about-database-permissions) for these warehouses. On BigQuery, use the "BigQuery User" role to assign these privileges. diff --git a/website/docs/guides/advanced/using-jinja.md b/website/docs/guides/advanced/using-jinja.md index 40cfd2af298..1cbe88dc9ca 100644 --- a/website/docs/guides/advanced/using-jinja.md +++ b/website/docs/guides/advanced/using-jinja.md @@ -9,7 +9,7 @@ If you'd like to work through this query, add [this CSV](https://github.com/dbt- While working through the steps of this model, we recommend that you have your compiled SQL open as well, to check what your Jinja compiles to. 
To do this: * **Using dbt Cloud:** Click the compile button to see the compiled SQL in the right hand pane -* **Using the dbt CLI:** Run `dbt compile` from the command line. Then open the compiled SQL file in the `target/compiled/{project name}/` directory. Use a split screen in your code editor to keep both files open at once. +* **Using dbt Core:** Run `dbt compile` from the command line. Then open the compiled SQL file in the `target/compiled/{project name}/` directory. Use a split screen in your code editor to keep both files open at once. ## Write the SQL without Jinja Consider a data model in which an `order` can have many `payments`. Each `payment` may have a `payment_method` of `bank_transfer`, `credit_card` or `gift_card`, and therefore each `order` can have multiple `payment_methods` diff --git a/website/docs/guides/best-practices/debugging-errors.md index 39670820ddd..fe600ec4f67 100644 --- a/website/docs/guides/best-practices/debugging-errors.md +++ b/website/docs/guides/best-practices/debugging-errors.md @@ -17,7 +17,7 @@ Learning how to debug is a skill, and one that will make you great at your role! - The `target/run` directory contains the SQL dbt executes to build your models. - The `logs/dbt.log` file contains all the queries that dbt runs, and additional logging. Recent errors will be at the bottom of the file. - **dbt Cloud users**: Use the above, or the `Details` tab in the command output. - - **dbt CLI users**: Note that your code editor _may_ be hiding these files from the tree [VSCode help](https://stackoverflow.com/questions/42891463/how-can-i-show-ignored-files-in-visual-studio-code)). + - **dbt Core users**: Note that your code editor _may_ be hiding these files from the tree ([VSCode help](https://stackoverflow.com/questions/42891463/how-can-i-show-ignored-files-in-visual-studio-code)). 5. If you are really stuck, try [asking for help](/community/resources/getting-help). Before doing so, take the time to write your question well so that others can diagnose the problem quickly. @@ -184,7 +184,7 @@ hello: world # this is not allowed ## Compilation Errors -_Note: if you're using the dbt Cloud IDE to work on your dbt project, this error often shows as a red bar in your command prompt as you work on your dbt project. For dbt CLI users, these won't get picked up until you run `dbt run` or `dbt compile`._ +_Note: if you're using the dbt Cloud IDE to work on your dbt project, this error often shows as a red bar in your command prompt as you work on your dbt project. For dbt Core users, these won't get picked up until you run `dbt run` or `dbt compile`._ ### Invalid `ref` function @@ -228,7 +228,7 @@ To fix this: - Use the error message to find your mistake To prevent this: -- _(dbt CLI users only)_ Use snippets to auto-complete pieces of Jinja ([atom-dbt package](https://github.com/dbt-labs/atom-dbt), [vscode-dbt extestion](https://marketplace.visualstudio.com/items?itemName=bastienboutonnet.vscode-dbt)) +- _(dbt Core users only)_ Use snippets to auto-complete pieces of Jinja ([atom-dbt package](https://github.com/dbt-labs/atom-dbt), [vscode-dbt extension](https://marketplace.visualstudio.com/items?itemName=bastienboutonnet.vscode-dbt))
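To make this concrete, here is a minimal sketch of the pattern behind many of these errors (the `stg_customers` model name is only illustrative): the `ref` function expects the model name as a quoted string, so missing quotes or a misspelled name is a common way to trigger a Compilation Error.

```sql
-- ✅ the model name is passed to ref() as a quoted string
select * from {{ ref('stg_customers') }}

-- ❌ a frequent mistake: an unquoted (or misspelled) model name, which dbt can't resolve
-- select * from {{ ref(stg_customers) }}
```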
@@ -280,7 +280,7 @@ To fix this: - Find the mistake and fix it To prevent this: -- (dbt CLI users) Turn on indentation guides in your code editor to help you inspect your files +- (dbt Core users) Turn on indentation guides in your code editor to help you inspect your files - Use a YAML validator ([example](http://www.yamllint.com/)) to debug any issues
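As a concrete illustration of the indentation advice above (the model and column names are placeholders), one common version of this error comes from a key that isn't nested where dbt expects it. In a properties file, `tests:` belongs under the column entry, aligned with its `name:` key:

```yml
version: 2

models:
  - name: customers
    columns:
      - name: customer_id
        tests:        # nested under the column entry, aligned with `name:` above
          - unique
          - not_null
        # out-denting `tests:` (for example, aligning it with `columns:`) changes what
        # the test applies to and is a frequent source of the YAML errors described here
```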
@@ -341,10 +341,10 @@ Database Error in model customers (models/customers.sql) 90% of the time, there's a mistake in the SQL of your model. To fix this: 1. Open the offending file: - **dbt Cloud:** Open the model (in this case `models/customers.sql` as per the error message) - - **dbt CLI:** Open the model as above. Also open the compiled SQL (in this case `target/run/jaffle_shop/models/customers.sql` as per the error message) — it can be useful to show these side-by-side in your code editor. + - **dbt Core:** Open the model as above. Also open the compiled SQL (in this case `target/run/jaffle_shop/models/customers.sql` as per the error message) — it can be useful to show these side-by-side in your code editor. 2. Try to re-execute the SQL to isolate the error: - **dbt Cloud:** Use the `Preview` button from the model file - - **dbt CLI:** Copy and paste the compiled query into a query runner (e.g. the Snowflake UI, or a desktop app like DataGrip / TablePlus) and execute it + - **dbt Core:** Copy and paste the compiled query into a query runner (e.g. the Snowflake UI, or a desktop app like DataGrip / TablePlus) and execute it 3. Fix the mistake. 4. Rerun the failed model. @@ -356,7 +356,7 @@ In some cases, these errors might occur as a result of queries that dbt runs "be In these cases, you should check out the logs — this contains _all_ the queries dbt has run. - **dbt Cloud**: Use the `Details` in the command output to see logs, or check the `logs/dbt.log` file -- **dbt CLI**: Open the `logs/dbt.log` file. +- **dbt Core**: Open the `logs/dbt.log` file. :::tip Isolating errors in the logs If you're hitting a strange `Database Error`, it can be a good idea to clean out your logs by opening the file, and deleting the contents. Then, re-execute `dbt run` for _just_ the problematic model. The logs will _just_ have the output you're looking for. @@ -379,6 +379,6 @@ Using the `Preview` button is useful when developing models and you want to visu We’ve all been there. dbt uses the last-saved version of a file when you execute a command. In most code editors, and in the dbt Cloud IDE, a dot next to a filename indicates that a file has unsaved changes. Make sure you hit `cmd + s` (or equivalent) before running any dbt commands — over time it becomes muscle memory. ### Editing compiled files -_(More likely for dbt CLI users)_ +_(More likely for dbt Core users)_ If you just opened a SQL file in the `target/` directory to help debug an issue, it's not uncommon to accidentally edit that file! To avoid this, try changing your code editor settings to grey out any files in the `target/` directory — the visual cue will help avoid the issue. diff --git a/website/docs/guides/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/guides/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md index 34c0e813725..801227924dd 100644 --- a/website/docs/guides/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md +++ b/website/docs/guides/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md @@ -13,7 +13,7 @@ git clone git@github.com:dbt-labs/jaffle-sl-template.git cd path/to/project ``` -Next before we start writing code, we'll need to install the MetricFlow CLI as an extension of a dbt adapter from PyPI. The MetricFlow CLI is compatible with Python versions 3.8 through 3.11. +Next, before you start writing code, you need to install MetricFlow as an extension of a dbt adapter from PyPI (dbt Core users only). 
MetricFlow is compatible with Python versions 3.8 through 3.11. We'll use pip to install MetricFlow and our dbt adapter: @@ -33,7 +33,7 @@ Lastly, to get to the pre-Semantic Layer starting state, checkout the `start-her git checkout start-here ``` -For more information you can [look at the docs](/docs/build/metricflow-cli) or checkout a [Quickstart](https://docs.getdbt.com/quickstarts) to get more familiar with setting up a dbt project. +For more information, refer to the [MetricFlow commands](/docs/build/metricflow-commands) or a [quickstart](/quickstarts) to get more familiar with setting up a dbt project. ## Basic commands diff --git a/website/docs/guides/best-practices/how-we-mesh/mesh-1-intro.md b/website/docs/guides/best-practices/how-we-mesh/mesh-1-intro.md new file mode 100644 index 00000000000..ba1660a8d82 --- /dev/null +++ b/website/docs/guides/best-practices/how-we-mesh/mesh-1-intro.md @@ -0,0 +1,39 @@ +--- +title: "Intro to dbt Mesh" +description: Getting started with dbt Mesh patterns +hoverSnippet: Learn how to get started with dbt Mesh +--- + +## What is dbt Mesh? + +Organizations of all sizes rely upon dbt to manage their data transformations, from small startups to large enterprises. At scale, it can be challenging to coordinate all the organizational and technical requirements demanded by your stakeholders within the scope of a single dbt project. To date, there also hasn't been a first-class way to effectively manage the dependencies, governance, and workflows between multiple dbt projects. + +Regardless of your organization's size and complexity, dbt should empower data teams to work independently and collaboratively; sharing data, code, and best practices without sacrificing security or autonomy. dbt Mesh provides the tooling for teams to finally achieve this. + +dbt Mesh is not a single product: it is a pattern enabled by a convergence of several features in dbt: + +- **[Cross-project references](/docs/collaborate/govern/project-dependencies#how-to-use-ref)** - this is the foundational feature that enables the multi-project deployments. `{{ ref() }}`s now work across dbt Cloud projects on Enterprise plans. +- **[dbt Explorer](/docs/collaborate/explore-projects)** - dbt Cloud's metadata-powered documentation platform, complete with full, cross-project lineage. +- **Governance** - dbt's new governance features allow you to manage access to your dbt models both within and across projects. + - **[Groups](/docs/collaborate/govern/model-access#groups)** - groups allow you to assign models to subsets within a project. + - **[Access](/docs/collaborate/govern/model-access#access-modifiers)** - access configs allow you to control who can reference models. +- **[Model Versions](/docs/collaborate/govern/model-versions)** - when coordinating across projects and teams, we recommend treating your data models as stable APIs. Model versioning is the mechanism to allow graceful adoption and deprecation of models as they evolve. +- **[Model Contracts](/docs/collaborate/govern/model-contracts)** - data contracts set explicit expectations on the shape of the data to ensure data changes upstream of dbt or within a project's logic don't break downstream consumers' data products. + +## Who is dbt Mesh for? + +The multi-project architecture helps organizations with mature, complex transformation workflows in dbt increase the flexibility and performance of their dbt projects. 
If you're already using dbt and your project has started to experience any of the following, you're likely ready to start exploring this paradigm: + +- The **number of models** in your project is degrading performance and slowing down development. +- Teams have developed **separate workflows** and need to decouple development from each other. +- **Security and governance** requirements are increasing and would benefit from increased isolation. + +dbt Cloud is designed to coordinate the features above and simplify the complexity to solve for these problems. + +If you're just starting your dbt journey, don't worry about building a multi-project architecture right away. You can _incrementally_ adopt the features in this guide as you scale. The collection of features work effectively as independent tools. Familiarizing yourself with the tooling and features that make up a multi-project architecture, and how they can apply to your organization will help you make better decisions as you grow. + +## Learning goals + +- Understand the **purpose and tradeoffs** of building a multi-project architecture. +- Develop an intuition for various **dbt Mesh patterns** and how to design a multi-project architecture for your organization. +- Establish recommended steps to **incrementally adopt** these patterns in your dbt implementation. diff --git a/website/docs/guides/best-practices/how-we-mesh/mesh-2-structures.md b/website/docs/guides/best-practices/how-we-mesh/mesh-2-structures.md new file mode 100644 index 00000000000..937515954af --- /dev/null +++ b/website/docs/guides/best-practices/how-we-mesh/mesh-2-structures.md @@ -0,0 +1,52 @@ +--- +title: Deciding how to structure your dbt Mesh +description: Getting started with dbt Mesh patterns +hoverSnippet: Learn how to get started with dbt Mesh +--- +## Exploring mesh patterns + +When adopting a multi-project architecture, where do you draw the lines between projects? + +How should you organize data workflows in a world where instead of having a single dbt DAG, you have multiple projects speaking to each other, each comprised of their own DAG? + +Adopting the dbt Mesh pattern is not a one-size-fits-all process. In fact, it's the opposite! It's about customizing your project structure to fit _your_ team and _your_ data. Now you can mold your organizational knowledge graph to your organizational people graph, bringing people and data closer together rather than compromising one for the other. + +While there is not a single best way to implement this pattern, there are some common decision points that will be helpful for you to consider. + +At a high level, you’ll need to decide: + +- Where to draw the lines between your dbt Projects -- i.e. how do you determine where to split your DAG and which models go in which project? +- How to manage your code -- do you want multiple dbt Projects living in the same repository (mono-repo) or do you want to have multiple repos with one repo per project? + +## Define your project interfaces by splitting your DAG + +The first (and perhaps most difficult!) decision when migrating to a multi-project architecture is deciding where to draw the line in your DAG to define the interfaces between your projects. Let's explore some language for discussing the design of these patterns. + +### Vertical splits + +Vertical splits separate out layers of transformation in DAG order. Let's look at some examples. 
+ +- **Splitting up staging and mart layers** to create a more tightly-controlled, shared set of components that other projects build on but can't edit. +- **Isolating earlier models for security and governance requirements** to separate out and mask PII data so that downstream consumers can't access it is a common use case for a vertical split. +- **Protecting complex or expensive data** to isolate large or complex models that are expensive to run so that they are safe from accidental selection, independently deployable, and easier to debug when they have issues. + +### Horizontal splits + +Horizontal splits separate your DAG based on source or domain. These splits are often based around the shape and size of the data and how it's used. Let's consider some possibilities for horizontal splitting. + +- **Team consumption patterns.** For example, splitting out the marketing team's data flow into a separate project. +- **Data from different sources.** For example, clickstream event data and transactional ecommerce data may need to be modeled independently of each other. +- **Team workflows.** For example, if two embedded groups operate at different paces, you may want to split the projects up so they can move independently. + +### Combining these strategies + +- **These are not either/or techniques**. You should consider both types of splits, and combine them in any way that makes sense for your organization. +- **Pick one type of split and focus on that first**. If you have a hub-and-spoke team topology for example, handle breaking out the central platform project before you split the remainder into domains. Then if you need to break those domains up horizontally you can focus on that after the fact. +- **DRY applies to underlying data, not just code.** Regardless of your strategy, you should not be sourcing the same rows and columns into multiple nodes. When working within a mesh pattern it becomes increasingly important that we don't duplicate logic or data. + +## Determine your git strategy + +A multi-project architecture can exist in a single repo (monorepo) or as multiple projects, with each one being in their own repository (multi-repo). + +- If you're a **smaller team** looking primarily to speed up and simplify development, a **monorepo** is likely the right choice, but can become unwieldy as the number of projects, models and contributors grow. +- If you’re a **larger team with multiple groups**, and need to decouple projects for security and enablement of different development styles and rhythms, a **multi-repo setup** is your best bet. diff --git a/website/docs/guides/best-practices/how-we-mesh/mesh-3-implementation.md b/website/docs/guides/best-practices/how-we-mesh/mesh-3-implementation.md new file mode 100644 index 00000000000..cfbbc7a1f28 --- /dev/null +++ b/website/docs/guides/best-practices/how-we-mesh/mesh-3-implementation.md @@ -0,0 +1,130 @@ +--- +title: "Implementing your mesh plan" +description: Getting started with dbt Mesh patterns +hoverSnippet: Learn how to get started with dbt Mesh +--- + +As mentioned before, the key decision in migrating to a multi-project architecture is understanding how your project is already being grouped, built, and deployed. We can use this information to inform our decision to split our project apart. + +- **Examine your jobs** - which sets of models are most often built together? +- **Look at your lineage graph** - how are models connected? 
+- **Look at your selectors** defined in `selectors.yml` - how do people already define resource groups? +- **Talk to teams** about what sort of separation naturally exists right now. + - Are there various domains people are focused on? + - Are there various sizes, shapes, and sources of data that get handled separately (such as click event data)? + - Are there people focused on separate levels of transformation, such as landing and staging data or building marts? + +## Add groups and access + +Once you have a sense of some initial groupings, you can first implement **group and access permissions** within a single project. + +- First you can create a [group](/docs/build/groups) to define the owner of a set of models. + +```yml +# in models/__groups.yml + +groups: + - name: marketing + owner: + - name: Ben Jaffleck + email: ben.jaffleck@jaffleshop.com +``` + +- Then, we can add models to that group using the `group:` key in the model's YAML entry. + +```yml +# in models/marketing/__models.yml + +models: + - name: fct_marketing_model + group: marketing + - name: stg_marketing_model + group: marketing +``` + +- Once you've added models to the group, you can **add [access](/docs/collaborate/govern/model-access) settings to the models** based on their connections between groups, *opting for the most private access that will maintain current functionality*. This means that any model that has *only* relationships to other models in the same group should be `private`, and any model that has cross-group relationships, or is a terminal node in the group DAG, should be `protected` so that other parts of the DAG can continue to reference it. + +```yml +# in models/marketing/__models.yml + +models: + - name: fct_marketing_model + group: marketing + access: protected + - name: stg_marketing_model + group: marketing + access: private +``` + +- **Validate these groups by incrementally migrating your jobs** to execute these groups specifically via selection syntax. We would recommend doing this in parallel to your production jobs until you’re sure about them. This will help you feel out if you’ve drawn the lines in the right place. +- If you find yourself **consistently making changes across multiple groups** when you update logic, that’s a sign that **you may want to rethink your groups**. + +## Split your projects + +1. **Move your grouped models into a subfolder**. This will include any model in the selected group, its associated YAML entry, as well as its parent or child resources as appropriate depending on where this group sits in your DAG. + 1. Note that just like in your dbt project, circular references are not allowed! Project B cannot have parents and children in Project A, for example. +2. **Create a new `dbt_project.yml` file** in the subdirectory. +3. **Copy any macros** used by the resources you moved. +4. **Create a new `packages.yml` file** in your subdirectory with the packages that are used by the resources you moved. +5. **Update `{{ ref }}` functions** — For any model that has a cross-project dependency (this may be in the files you moved, or in the files that remain in your project): + 1. Update the `{{ ref() }}` function to have two arguments, where the first is the name of the source project and the second is the name of the model: e.g. `{{ ref('jaffle_shop', 'my_upstream_model') }}` + 2. Update the upstream, cross-project parents’ `access` configs to `public`, ensuring any project can safely `{{ ref() }}` those models. + 3. 
We *highly* recommend adding a [model contract](/docs/collaborate/govern/model-contracts) to the upstream models to ensure the data shape is consistent and reliable for your downstream consumers. +6. **Create a `dependencies.yml` file** ([docs](/docs/collaborate/govern/project-dependencies)) for the downstream project, declaring the upstream project as a dependency. + +```yml + +# in dependencies.yml +projects: + - name: jaffle_shop +``` + +### Best practices + +- When you’ve **confirmed the right groups**, it's time to split your projects. + - **Do *one* group at a time**! + - **Do *not* refactor as you migrate**, however tempting that may be. Focus on getting 1-to-1 parity and log any issues you find in doing the migration for later. Once you’ve fully migrated the project then you can start optimizing it for its new life as part of your mesh. +- Start by splitting your project within the same repository for full git tracking and easy reversion if you need to start from scratch. + + +## Connecting existing projects + +Some organizations may already be coordinating across multiple dbt projects. Most often this is via: + +1. Installing parent projects as dbt packages +2. Using `{{ source() }}` functions to read the outputs of a parent project as inputs to a child project. + +This has a few drawbacks: + +1. If using packages, each project has to include *all* resources from *all* projects in its manifest, slowing down dbt and the development cycle. +2. If using sources, there are breakages in the lineage, as there's no real connection between the parent and child projects. + +The migration steps here are much simpler than splitting up a monolith! + +1. If using the `package` method: + 1. In the parent project: + 1. mark all models being referenced downstream as `public` and add a model contract. + 2. In the child project: + 1. Remove the package entry from `packages.yml` + 2. Add the upstream project to your `dependencies.yml` + 3. Update the `{{ ref() }}` functions to models from the upstream project to include the project name argument. +1. If using `source` method: + 1. In the parent project: + 1. mark all models being imported downstream as `public` and add a model contract. + 2. In the child project: + 1. Add the upstream project to your `dependencies.yml` + 2. Replace the `{{ source() }}` functions with cross project `{{ ref() }}` functions. + 3. Remove the unnecessary `source` definitions. + +## Additional Resources +### Our example projects + +We've provided a set of example projects you can use to explore the topics covered here. We've split our [Jaffle Shop](https://github.com/dbt-labs/jaffle-shop) project into 3 separate projects in a multi-repo dbt Mesh. Note that you'll need to leverage dbt Cloud to use multi-project architecture, as cross-project references are powered via dbt Cloud's APIs. + +- **[Platform](https://github.com/dbt-labs/jaffle-shop-mesh-platform)** - containing our centralized staging models. +- **[Marketing](https://github.com/dbt-labs/jaffle-shop-mesh-marketing)** - containing our marketing marts. +- **[Finance](https://github.com/dbt-labs/jaffle-shop-mesh-finance)** - containing our finance marts. + +### dbt-meshify + +We recommend using the `dbt-meshify` [command line tool]() to help you do this. This comes with CLI operations to automate most of the above steps. 
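To tie the splitting steps above together, here is a minimal sketch (the file path, column names, and data types are illustrative, and data types vary by warehouse) of what an upstream model's YAML entry might look like once it has been marked `public` and given an enforced contract:

```yml
# in the upstream project, for example models/core/__models.yml (illustrative path)

models:
  - name: my_upstream_model
    access: public            # other projects can now use the two-argument ref() shown above
    config:
      contract:
        enforced: true        # locks in the column names and data types consumers rely on
    columns:
      - name: id              # placeholder columns
        data_type: integer
      - name: customer_name
        data_type: string
```

With the upstream model configured this way, the downstream project only needs the `dependencies.yml` entry and the two-argument `{{ ref('jaffle_shop', 'my_upstream_model') }}` call described earlier.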
diff --git a/website/docs/guides/best-practices/materializations/materializations-guide-4-incremental-models.md b/website/docs/guides/best-practices/materializations/materializations-guide-4-incremental-models.md index 603cbc8cda1..cd4264bafd3 100644 --- a/website/docs/guides/best-practices/materializations/materializations-guide-4-incremental-models.md +++ b/website/docs/guides/best-practices/materializations/materializations-guide-4-incremental-models.md @@ -29,7 +29,7 @@ We did our last `dbt build` job on `2022-01-31`, so any new orders since that ru - 🏔️ build the table from the **beginning of time again — a _table materialization_** - Simple and solid, if we can afford to do it (in terms of time, compute, and money — which are all directly correlated in a cloud warehouse). It’s the easiest and most accurate option. - 🤏 find a way to run **just new and updated rows since our previous run — _an_ _incremental materialization_** - - If we _can’t_ realistically afford to run the whole table — due to complex transformations or big source data, it takes too long — then we want to build incrementally. We want to just transform and add the row with id 567 below, _not_ the previous two with ids 123 and 456 that are already in the table. + - If we _can’t_ realistically afford to run the whole table — due to complex transformations or big source data, it takes too long — then we want to build incrementally. We want to just transform and add the row with id 567 below, _not_ the previous two with ids 123 and 234 that are already in the table. | order_id | order_status | customer_id | order_item_id | ordered_at | updated_at | | -------- | ------------ | ----------- | ------------- | ---------- | ---------- | diff --git a/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md b/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md index 07811b42594..909618ef8a5 100644 --- a/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md +++ b/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md @@ -12,7 +12,7 @@ hoverSnippet: Read this guide to understand how to examine your builds in dbt. - ⌚ dbt keeps track of how **long each model took to build**, when it started, when it finished, its completion status (error, warn, or success), its materialization type, and _much_ more. - 🖼️ This information is stored in a couple files which dbt calls **artifacts**. - 📊 Artifacts contain a ton of information in JSON format, so aren’t easy to read, but **dbt Cloud** packages the most useful bits of information into a tidy **visualization** for you. -- ☁️ If you’re not using Cloud, we can still use the output of the **dbt CLI to understand our runs**. +- ☁️ If you’re not using Cloud, we can still use the output of the **dbt Core CLI to understand our runs**. ### Model Timing @@ -23,9 +23,9 @@ That’s where dbt Cloud’s Model Timing visualization comes in extremely handy - 🧵 This view lets us see our **mapped out in threads** (up to 64 threads, we’re currently running with 4, so we get 4 tracks) over time. You can think of **each thread as a lane on a highway**. - ⌛ We can see above that `customer_status_histories` is **taking by far the most time**, so we may want to go ahead and **make that incremental**. -If you aren’t using dbt Cloud, that’s okay! 
We don’t get a fancy visualization out of the box, but we can use the output from the dbt CLI to check our model times, and it’s a great opportunity to become familiar with that output. +If you aren’t using dbt Cloud, that’s okay! We don’t get a fancy visualization out of the box, but we can use the output from the dbt Core CLI to check our model times, and it’s a great opportunity to become familiar with that output. -### dbt CLI output +### dbt Core CLI output If you’ve ever run dbt, whether `build`, `test`, `run` or something else, you’ve seen some output like below. Let’s take a closer look at how to read this. diff --git a/website/docs/guides/dbt-ecosystem/dbt-python-snowpark/6-foundational-structure.md b/website/docs/guides/dbt-ecosystem/dbt-python-snowpark/6-foundational-structure.md index e387b208dd1..8a938e10c34 100644 --- a/website/docs/guides/dbt-ecosystem/dbt-python-snowpark/6-foundational-structure.md +++ b/website/docs/guides/dbt-ecosystem/dbt-python-snowpark/6-foundational-structure.md @@ -71,7 +71,7 @@ In this step, we’ll need to create a development branch and set up project lev - `materialized` — Tells dbt how to materialize models when compiling the code before it pushes it down to Snowflake. All models in the `marts` folder will be built as tables. - `tags` — Applies tags at a directory level to all models. All models in the `aggregates` folder will be tagged as `bi` (abbreviation for business intelligence). - `docs` — Specifies the `node_color` either by the plain color name or a hex value. -5. [Materializations](/docs/build/materializations) are strategies for persisting dbt models in a warehouse, with `tables` and `views` being the most commonly utilized types. By default, all dbt models are materialized as views and other materialization types can be configured in the `dbt_project.yml` file or in a model itself. It’s very important to note *Python models can only be materialized as tables or incremental models.* Since all our Python models exist under `marts`, the following portion of our `dbt_project.yml` ensures no errors will occur when we run our Python models. Starting with [dbt version 1.4](/guides/migration/versions/upgrading-to-v1.4#updates-to-python-models), Python files will automatically get materialized as tables even if not explicitly specified. +5. [Materializations](/docs/build/materializations) are strategies for persisting dbt models in a warehouse, with `tables` and `views` being the most commonly utilized types. By default, all dbt models are materialized as views and other materialization types can be configured in the `dbt_project.yml` file or in a model itself. It’s very important to note *Python models can only be materialized as tables or incremental models.* Since all our Python models exist under `marts`, the following portion of our `dbt_project.yml` ensures no errors will occur when we run our Python models. Starting with [dbt version 1.4](/docs/dbt-versions/core-upgrade/upgrading-to-v1.4#updates-to-python-models), Python files will automatically get materialized as tables even if not explicitly specified. 
```yaml marts:     diff --git a/website/docs/guides/dbt-ecosystem/sl-partner-integration-guide.md b/website/docs/guides/dbt-ecosystem/sl-partner-integration-guide.md index 68037bfd0cd..936a54465e8 100644 --- a/website/docs/guides/dbt-ecosystem/sl-partner-integration-guide.md +++ b/website/docs/guides/dbt-ecosystem/sl-partner-integration-guide.md @@ -4,11 +4,6 @@ id: "sl-partner-integration-guide" description: Learn about partner integration guidelines, roadmap, and connectivity. --- - -import NewChanges from '/snippets/_new-sl-changes.md'; - - - To fit your tool within the world of the Semantic Layer, dbt Labs offers some best practice recommendations for how to expose metrics and allow users to interact with them seamlessly. :::note @@ -20,7 +15,7 @@ This is an evolving guide that is meant to provide recommendations based on our To build a dbt Semantic Layer integration: -- We offer a [JDBC](/docs/dbt-cloud-apis/sl-jdbc) API (and will soon offer a GraphQL API). Refer to the dedicated [dbt Semantic Layer API](/docs/dbt-cloud-apis/sl-api-overview) for more technical integration details. +- We offer a [JDBC](/docs/dbt-cloud-apis/sl-jdbc) API and [GraphQL API](/docs/dbt-cloud-apis/sl-graphql). Refer to the dedicated [dbt Semantic Layer API](/docs/dbt-cloud-apis/sl-api-overview) for more technical integration details. - Familiarize yourself with the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and [MetricFlow](/docs/build/about-metricflow)'s key concepts. There are two main objects: @@ -33,6 +28,14 @@ The dbt Semantic Layer APIs authenticate with `environmentId`, `SERVICE_TOKEN`, We recommend you provide users with separate input fields with these components for authentication (dbt Cloud will surface these parameters for the user). +### Exposing metadata to dbt Labs + +When building an integration, we recommend you expose certain metadata in the request for analytics purposes. Among other items, it is helpful to have the following: + +- Your application's name (such as 'Tableau') +- The email of the person querying your application +- The version of dbt they are on. + ## Best practices on exposing metrics @@ -85,7 +88,7 @@ Allow users to query either one metric alone without dimensions or multiple metr - Allow toggling between metrics/dimensions seamlessly. -- Be clear on exposing what dimensions are queryable with what metrics and hide things that don’t apply, and vice versa. +- Be clear on exposing what dimensions are queryable with what metrics and hide things that don’t apply. (Our APIs provide calls for you to get relevant dimensions for metrics, and vice versa). - Only expose time granularities (monthly, daily, yearly) that match the available metrics. * For example, if a dbt model and its resulting semantic model have a monthly granularity, make sure querying data with a 'daily' granularity isn't available to the user. Our APIs have functionality that will help you surface the correct granularities @@ -109,6 +112,15 @@ For better analysis, it's best to have the context of the metrics close to where - Allow for creating other metadata that’s useful for the metric. We can provide some of this information in our configuration (Display name, Default Granularity for View, Default Time range), but there may be other metadata that your tool wants to provide to make the metric richer. +### Transparency and using compile + +For transparency and additional context, we recommend you have an easy way for the user to obtain the SQL that MetricFlow generates. 
Depending on what API you are using, you can do this by using our `compile` parameter. This is incredibly powerful and emphasizes transparency and openness, particularly for technically inclined users. + + +### Where filters and optimization + +In the cases where our APIs support either a string or a filter list for the `where` clause, we always recommend that your application utilizes the filter list in order to gain maximum pushdown benefits. The `where` string may be more intuitive for users writing queries during testing, but it will not have the performance benefits of the filter list in a production environment. + ## Example stages of an integration These are recommendations on how to evolve a Semantic Layer integration and not a strict runbook. @@ -136,14 +148,6 @@ These are recommendations on how to evolve a Semantic Layer integration and not * Querying dimensions without metrics and other more advanced querying functionality * Suggest metrics to users based on teams/identity, and so on. -### A note on transparency and using compile - -For transparency and additional context, we recommend you have an easy way for the user to obtain the SQL that MetricFlow generates. Depending on what API you are using, you can do this by using our compile parameter. This is incredibly powerful because we want to be very transparent to the user about what we're doing and do not want to be a black box. This would be mostly beneficial to a technical user. - - -### A note on where filters - -In the cases where our APIs support either a string or a filter list for the `where` clause, we always recommend that your application utilizes the filter list in order to gain maximum pushdown benefits. The `where` string may be more intuitive for users writing queries during testing, but it will not have the performance benefits of the filter list in a production environment. ## Related docs diff --git a/website/docs/guides/migration/sl-migration.md b/website/docs/guides/migration/sl-migration.md index c9def4537a3..56cd6dc9d80 100644 --- a/website/docs/guides/migration/sl-migration.md +++ b/website/docs/guides/migration/sl-migration.md @@ -12,7 +12,7 @@ The legacy Semantic Layer will be deprecated in H2 2023. Additionally, the `dbt_ The metrics specification in dbt Core is changed in v1.6 to support the integration of MetricFlow. It's strongly recommended that you refer to [Build your metrics](/docs/build/build-metrics-intro) and before getting started so you understand the core concepts of the Semantic Layer. -dbt Labs recommends completing these steps in a local dev environment instead of the IDE: +dbt Labs recommends completing these steps in a local dev environment (such as the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation)) instead of the dbt Cloud IDE: 1. Create new Semantic Model configs as YAML files in your dbt project.* 1. Upgrade the metrics configs in your project to the new spec.* @@ -63,11 +63,19 @@ You might need to audit metric values during the migration to ensure that the hi This step is only relevant to users who want the legacy and new semantic layer to run in parallel for a short time. This will let you recreate content in downstream tools like Hex and Mode with minimal downtime. If you do not need to recreate assets in these tools skip to step 5. 1. Create a new deployment environment in dbt Cloud and set the dbt version to 1.6 or higher. -2. Choose `Only run on a custom branch` and point to the branch that has the updated metric definition + +2. 
Select **Only run on a custom branch** and point to the branch that has the updated metric definition. + 3. Set the deployment schema to a temporary migration schema, such as `tmp_sl_migration`. Optional, you can create a new database for the migration. -4. Create a job to parse your project, such as `dbt parse`, and run it. Make sure this job succeeds, There needs to be a successful job in your environment in order to set up the semantic layer -5. In Account Settings > Projects > Project details click `Configure the Semantic Layer`. Under **Environment**select the deployment environment you created in the previous step. Save your configuration. -6. In the Project details page, click `Generate service token` and grant it `Semantic Layer Only` and `Metadata Only` permissions. Save this token securely - you will need it to connect to the semantic layer. + +4. Create a job to parse your project, such as `dbt parse`, and run it. Make sure this job succeeds. There needs to be a successful job in your environment in order to set up the semantic layer. + +5. Select **Account Settings** -> **Projects** -> **Project details** and choose **Configure the Semantic Layer**. + +6. Under **Environment**, select the deployment environment you created in the previous step. Save your configuration. + +7. In the **Project details** page, click **Generate service token** and grant it **Semantic Layer Only** and **Metadata Only** permissions. Save this token securely. You will need it to connect to the semantic layer. + At this point, both the new semantic layer and the old semantic layer will be running. The new semantic layer will be pointing at your migration branch with the updated metrics definitions. @@ -106,7 +114,7 @@ To learn more about integrating with Hex, check out their [documentation](https: If you created a new environment in [Step 3](#step-3-setup-the-semantic-layer-in-a-new-environment): -3. Update your Environment in Account Settings > Project Details > Edit Semantic Layer Configuration to point to your production environment +3. Update your Environment in **Account Settings** -> **Project Details** -> **Edit Semantic Layer Configuration** to point to your production environment 4. Delete your migration environment. Be sure to update your connection details in any downstream tools to account for the environment change. diff --git a/website/docs/guides/migration/tools/migrating-from-spark-to-databricks.md b/website/docs/guides/migration/tools/migrating-from-spark-to-databricks.md index f5549c58416..cd0577c2d96 100644 --- a/website/docs/guides/migration/tools/migrating-from-spark-to-databricks.md +++ b/website/docs/guides/migration/tools/migrating-from-spark-to-databricks.md @@ -35,7 +35,7 @@ In both dbt Core and dbt Cloud, you can migrate your projects to the Databricks- ### Prerequisites -- Your project must be compatible with dbt 1.0 or greater. Refer to [Upgrading to v1.0](/guides/migration/versions/upgrading-to-v1.0) for details. For the latest version of dbt, refer to [Upgrading to v1.3](/guides/migration/versions/upgrading-to-v1.3). +- Your project must be compatible with dbt 1.0 or greater. Refer to [Upgrading to v1.0](/docs/dbt-versions/core-upgrade/upgrading-to-v1.0) for details. For the latest version of dbt, refer to [Upgrading to v1.3](/docs/dbt-versions/core-upgrade/upgrading-to-v1.3). - For dbt Cloud, you need administrative (admin) privileges to migrate dbt projects. 
diff --git a/website/docs/guides/migration/versions/00-upgrading-to-v1.7.md b/website/docs/guides/migration/versions/00-upgrading-to-v1.7.md deleted file mode 100644 index 036c734dfb1..00000000000 --- a/website/docs/guides/migration/versions/00-upgrading-to-v1.7.md +++ /dev/null @@ -1,24 +0,0 @@ ---- -title: "Upgrading to v1.7 (beta)" -description: New features and changes in dbt Core v1.7 ---- - -## Resources - -- [Changelog](https://github.com/dbt-labs/dbt-core/blob/8aaed0e29f9560bc53d9d3e88325a9597318e375/CHANGELOG.md) -- [CLI Installation guide](/docs/core/installation) -- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud) -- [Release schedule](https://github.com/dbt-labs/dbt-core/issues/7481) - -## What to know before upgrading - -dbt Labs is committed to providing backward compatibility for all versions 1.x, with the exception of any changes explicitly mentioned below. If you encounter an error upon upgrading, please let us know by [opening an issue](https://github.com/dbt-labs/dbt-core/issues/new). - -### Behavior changes - -**COMING SOON** - -### Quick hits - -**COMING SOON** - diff --git a/website/docs/guides/orchestration/airflow-and-dbt-cloud/1-airflow-and-dbt-cloud.md b/website/docs/guides/orchestration/airflow-and-dbt-cloud/1-airflow-and-dbt-cloud.md index d453106eead..d6760771b79 100644 --- a/website/docs/guides/orchestration/airflow-and-dbt-cloud/1-airflow-and-dbt-cloud.md +++ b/website/docs/guides/orchestration/airflow-and-dbt-cloud/1-airflow-and-dbt-cloud.md @@ -19,7 +19,7 @@ There are [so many great examples](https://gitlab.com/gitlab-data/analytics/-/bl ### Airflow + dbt Cloud API w/Custom Scripts -This has served as a bridge until the fabled Astronomer + dbt Labs-built dbt Cloud provider became generally available [here](https://registry.astronomer.io/providers/dbt-cloud?type=Sensors&utm_campaign=Monthly%20Product%20Updates&utm_medium=email&_hsmi=208603877&utm_content=208603877&utm_source=hs_email). +This has served as a bridge until the fabled Astronomer + dbt Labs-built dbt Cloud provider became generally available [here](https://registry.astronomer.io/providers/dbt%20Cloud/versions/latest). There are many different permutations of this over time: diff --git a/website/docs/guides/orchestration/custom-cicd-pipelines/4-dbt-cloud-job-on-pr.md b/website/docs/guides/orchestration/custom-cicd-pipelines/4-dbt-cloud-job-on-pr.md index b58bab175b3..1a75fdc17ac 100644 --- a/website/docs/guides/orchestration/custom-cicd-pipelines/4-dbt-cloud-job-on-pr.md +++ b/website/docs/guides/orchestration/custom-cicd-pipelines/4-dbt-cloud-job-on-pr.md @@ -94,7 +94,7 @@ Add this as a macro to your project. It takes 2 arguments that lets you control ```sql {# This macro finds PR schemas older than a set date and drops them - The maco defaults to 10 days old, but can be configued with the input argument age_in_days + The macro defaults to 10 days old, but can be configured with the input argument age_in_days Sample usage with different date: dbt run-operation pr_schema_cleanup --args "{'database_to_clean': 'analytics','age_in_days':'15'}" #} diff --git a/website/docs/quickstarts/bigquery-qs.md b/website/docs/quickstarts/bigquery-qs.md index 7f7f9aa7655..546b56c234c 100644 --- a/website/docs/quickstarts/bigquery-qs.md +++ b/website/docs/quickstarts/bigquery-qs.md @@ -73,7 +73,6 @@ In order to let dbt connect to your warehouse, you'll need to generate a keyfile 1. Start the [GCP credentials wizard](https://console.cloud.google.com/apis/credentials/wizard). 
Make sure your new project is selected in the header. If you do not see your account or project, click your profile picture to the right and verify you are using the correct email account. For **Credential Type**: - From the **Select an API** dropdown, choose **BigQuery API** - Select **Application data** for the type of data you will be accessing - - Select **No, I’m not using them** and click **Next**. - Click **Next** to create a new service account. 2. Create a service account for your new project from the [Service accounts page](https://console.cloud.google.com/projectselector2/iam-admin/serviceaccounts?supportedpurview=project). For more information, refer to [Create a service account](https://developers.google.com/workspace/guides/create-credentials#create_a_service_account) in the Google Cloud docs. As an example for this guide, you can: - Type `dbt-user` as the **Service account name** @@ -89,25 +88,22 @@ In order to let dbt connect to your warehouse, you'll need to generate a keyfile 4. Click **Upload a Service Account JSON File** in settings. 5. Select the JSON file you downloaded in [Generate BigQuery credentials](#generate-bigquery-credentials) and dbt Cloud will fill in all the necessary fields. 6. Click **Test Connection**. This verifies that dbt Cloud can access your BigQuery account. -7. Click **Next** if the test succeeded. If it failed, you might need to go back and regenerate your BigQuery credentials. +7. Click **Next** if the test succeeds. If it fails, you might need to go back and regenerate your BigQuery credentials. ## Set up a dbt Cloud managed repository -## Initialize your dbt project​ and start developing +## Initialize your dbt project Now that you have a repository configured, you can initialize your project and start development in dbt Cloud: 1. Click **Start developing in the IDE**. It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse. 2. Above the file tree to the left, click **Initialize dbt project**. This builds out your folder structure with example models. 3. Make your initial commit by clicking **Commit and sync**. Use the commit message `initial commit` and click **Commit**. This creates the first commit to your managed repo and allows you to open a branch where you can add new dbt code. 4. You can now directly query data from your warehouse and execute `dbt run`. You can try this out now: - - Click **+ Create new file**, add this query to the new file, and click **Save as** to save the new file: - ```sql - select * from `dbt-tutorial.jaffle_shop.customers` - ``` - In the command line bar at the bottom, enter `dbt run` and click **Enter**. You should see a `dbt run succeeded` message. + - To confirm the success of the above command, navigate to the BigQuery Console and discover the newly created models. ## Build your first model 1. Under **Version Control** on the left, click **Create branch**. You can name it `add-customers-model`. You need to create a new branch since the main branch is set to read-only mode. @@ -175,7 +171,7 @@ select * from final 6. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run and see the three models. -Later, you can connect your business intelligence (BI) tools to these views and tables so they only read cleaned up data rather than raw data in your BI tool. 
+Later, you can connect your business intelligence (BI) tools to these views and tables so they only read cleaned-up data rather than raw data in your BI tool. #### FAQs @@ -283,7 +279,7 @@ Later, you can connect your business intelligence (BI) tools to these views and 4. Execute `dbt run`. - This time, when you performed a `dbt run`, separate views/tables were created for `stg_customers`, `stg_orders` and `customers`. dbt inferred the order to run these models. Because `customers` depends on `stg_customers` and `stg_orders`, dbt builds `customers` last. You do not need to explicitly define these dependencies. + This time, when you performed a `dbt run`, separate views/tables were created for `stg_customers`, `stg_orders`, and `customers`. dbt inferred the order to run these models. Because `customers` depends on `stg_customers` and `stg_orders`, dbt builds `customers` last. You do not need to explicitly define these dependencies. #### FAQs {#faq-2} diff --git a/website/docs/quickstarts/manual-install-qs.md b/website/docs/quickstarts/manual-install-qs.md index 05336178ff6..fc43d38115b 100644 --- a/website/docs/quickstarts/manual-install-qs.md +++ b/website/docs/quickstarts/manual-install-qs.md @@ -9,11 +9,11 @@ hide_table_of_contents: true --- ## Introduction -When you use dbt Core to work with dbt, you will be editing files locally using a code editor, and running projects using the dbt command line interface (dbt CLI). If you'd rather edit files and run projects using the web-based Integrated Development Environment (IDE), you should refer to the [dbt Cloud quickstarts](/quickstarts). +When you use dbt Core to work with dbt, you will be editing files locally using a code editor, and running projects using a command line interface (CLI). If you'd rather edit files and run projects using the web-based Integrated Development Environment (IDE), you should refer to the [dbt Cloud quickstarts](/quickstarts). You can also develop and run dbt commands using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) — a dbt Cloud powered command line. ### Prerequisites -* To use the dbt CLI, it's important that you know some basics of the Terminal. In particular, you should understand `cd`, `ls` and `pwd` to navigate through the directory structure of your computer easily. +* To use dbt Core, it's important that you know some basics of the Terminal. In particular, you should understand `cd`, `ls` and `pwd` to navigate through the directory structure of your computer easily. * Install dbt Core using the [installation instructions](/docs/core/installation) for your operating system. * Complete [Setting up (in BigQuery)](/quickstarts/bigquery?step=2) and [Loading data (BigQuery)](/quickstarts/bigquery?step=3). * [Create a GitHub account](https://github.com/join) if you don't already have one. @@ -103,16 +103,16 @@ When developing locally, dbt connects to your using jaffle_shop: # this needs to match the profile in your dbt_project.yml file target: dev outputs: - dev: - type: bigquery - method: service-account - keyfile: /Users/BBaggins/.dbt/dbt-tutorial-project-331118.json # replace this with the full path to your keyfile - project: grand-highway-265418 # Replace this with your project id - dataset: dbt_bbagins # Replace this with dbt_your_name, e.g. 
dbt_bilbo - threads: 1 - timeout_seconds: 300 - location: US - priority: interactive + dev: + type: bigquery + method: service-account + keyfile: /Users/BBaggins/.dbt/dbt-tutorial-project-331118.json # replace this with the full path to your keyfile + project: grand-highway-265418 # Replace this with your project id + dataset: dbt_bbagins # Replace this with dbt_your_name, e.g. dbt_bilbo + threads: 1 + timeout_seconds: 300 + location: US + priority: interactive ``` @@ -196,7 +196,7 @@ $ git checkout -b add-customers-model 4. From the command line, enter `dbt run`.
- +
When you return to the BigQuery console, you can `select` from this model. diff --git a/website/docs/reference/artifacts/manifest-json.md b/website/docs/reference/artifacts/manifest-json.md index 5e8dcedd2d5..47a9849eda5 100644 --- a/website/docs/reference/artifacts/manifest-json.md +++ b/website/docs/reference/artifacts/manifest-json.md @@ -3,15 +3,9 @@ title: "Manifest JSON file" sidebar_label: "Manifest" --- -| dbt Core version | Manifest version | -|------------------|---------------------------------------------------------------| -| v1.6 | [v10](https://schemas.getdbt.com/dbt/manifest/v10/index.html) | -| v1.5 | [v9](https://schemas.getdbt.com/dbt/manifest/v9/index.html) | -| v1.4 | [v8](https://schemas.getdbt.com/dbt/manifest/v8/index.html) | -| v1.3 | [v7](https://schemas.getdbt.com/dbt/manifest/v7/index.html) | -| v1.2 | [v6](https://schemas.getdbt.com/dbt/manifest/v6/index.html) | -| v1.1 | [v5](https://schemas.getdbt.com/dbt/manifest/v5/index.html) | -| v1.0 | [v4](https://schemas.getdbt.com/dbt/manifest/v4/index.html) | +import ManifestVersions from '/snippets/_manifest-versions.md'; + + **Produced by:** Any command that parses your project. This includes all commands **except** [`deps`](/reference/commands/deps), [`clean`](/reference/commands/clean), [`debug`](/reference/commands/debug), [`init`](/reference/commands/init) diff --git a/website/docs/reference/artifacts/other-artifacts.md b/website/docs/reference/artifacts/other-artifacts.md index d776bc8a099..205bdfc1a14 100644 --- a/website/docs/reference/artifacts/other-artifacts.md +++ b/website/docs/reference/artifacts/other-artifacts.md @@ -39,4 +39,8 @@ This file is useful for investigating performance issues in dbt Core's graph alg It is more anonymized and compact than [`manifest.json`](/reference/artifacts/manifest-json) and [`graph.gpickle`](#graph.gpickle). -It contains only the `name` and `type` of each node along with IDs of its child nodes (`succ`). It includes that information at two separate points in time: immediately after the graph is linked together (`linked`), and after test edges have been added (`with_test_edges`). +It includes that information at two separate points in time: +1. `linked` — immediately after the graph is linked together, and +2. `with_test_edges` — after test edges have been added. + +Each of those points in time contains the `name` and `type` of each node and `succ` contains the keys of its child nodes. diff --git a/website/docs/reference/commands/clone.md b/website/docs/reference/commands/clone.md index a3c8bb236c7..9852ce84c17 100644 --- a/website/docs/reference/commands/clone.md +++ b/website/docs/reference/commands/clone.md @@ -21,7 +21,7 @@ The `clone` command is useful for: dbt clone --state path/to/artifacts # clone one_specific_model of my models from specified state to my target schema(s) -dbt clone --select one_specific_model --state path/to/artifacts +dbt clone --select "one_specific_model" --state path/to/artifacts # clone all of my models from specified state to my target schema(s) and recreate all pre-existing relations in the current target dbt clone --state path/to/artifacts --full-refresh @@ -37,3 +37,5 @@ Unlike deferral, `dbt clone` requires some compute and creation of additional ob For example, by creating actual data warehouse objects, `dbt clone` allows you to test out your code changes on downstream dependencies _outside of dbt_ (such as a BI tool). 
As another example, you could `clone` your modified incremental models as the first step of your dbt Cloud CI job to prevent costly `full-refresh` builds for warehouses that support zero-copy cloning. + +Check out [this Developer blog post](https://docs.getdbt.com/blog/to-defer-or-to-clone) for best practices on when to use `dbt clone` vs. deferral. diff --git a/website/docs/reference/commands/cmd-docs.md index 754c5e93baf..bc4840464b8 100644 --- a/website/docs/reference/commands/cmd-docs.md +++ b/website/docs/reference/commands/cmd-docs.md @@ -19,6 +19,18 @@ The command is responsible for generating your project's documentation website b dbt docs generate ``` + + +Use the `--select` argument to limit the nodes included within `catalog.json`. When this flag is provided, step (3) will be restricted to the selected nodes. All other nodes will be excluded. Step (2) is unaffected. + +**Example**: +```shell +dbt docs generate --select +orders +``` + + + + Use the `--no-compile` argument to skip re-compilation. When this flag is provided, `dbt docs generate` will skip step (2) described above. **Example**: diff --git a/website/docs/reference/commands/compile.md index ed403d2af32..cde65b7c6b6 100644 --- a/website/docs/reference/commands/compile.md +++ b/website/docs/reference/commands/compile.md @@ -29,7 +29,7 @@ This will log the compiled SQL to the terminal, in addition to writing to the `t For example: ```bash -dbt compile --select stg_payments +dbt compile --select "stg_payments" dbt compile --inline "select * from {{ ref('raw_orders') }}" ``` @@ -37,7 +37,7 @@ returns the following: ```bash -dbt compile --select stg_orders +dbt compile --select "stg_orders" 21:17:09 Running with dbt=1.5.0-b5 21:17:09 Found 5 models, 20 tests, 0 snapshots, 0 analyses, 425 macros, 0 operations, 3 seed files, 0 sources, 0 exposures, 0 metrics, 0 groups 21:17:09 diff --git a/website/docs/reference/commands/deps.md index 4c7a36606e2..f4f8153c115 100644 --- a/website/docs/reference/commands/deps.md +++ b/website/docs/reference/commands/deps.md @@ -57,3 +57,31 @@ Installing calogica/dbt_date@0.4.0 Updates available for packages: ['tailsdotcom/dbt_artifacts', 'dbt-labs/snowplow'] Update your versions in packages.yml, then run dbt deps ``` + + + +The first time you run `dbt deps`, dbt generates the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. This file contains all the resolved packages. Each subsequent run records the packages installed in this file. If the subsequent `dbt deps` runs contain no updated packages in `dependencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`. + +When you update the package spec and run `dbt deps` again, the package-lock and package files update accordingly. You can run `dbt deps --lock` to update the `package-lock.yml` with the most recent dependencies from `packages`. + +The `--add` flag allows you to add a package to the `packages.yml` with configurable `--version` and `--source` information. The `--dry-run` flag, when set to `False` (default), recompiles the `package-lock.yml` file after a new package is added to the `packages.yml` file. Set the flag to `True` for the changes to not persist. 
+ +Examples of the `--add` flag: +```shell +# add package from hub (--source arg defaults to "hub") +dbt deps add --package dbt-labs/dbt_utils --version 1.0.0 + +# add package from hub with semantic version +dbt deps add --package dbt-labs/snowplow --version ">=0.7.0,<0.8.0" + +# add package from git +dbt deps add --package https://github.com/fivetran/dbt_amplitude --version v0.3.0 --source git + +# add package from local (--version not required for local) +dbt deps add --package /opt/dbt/redshift --source local + +# add package to packages.yml WITHOUT updating package-lock.yml +dbt deps add --package dbt-labs/dbt_utils --version 1.0.0 --dry-run True + +``` + \ No newline at end of file diff --git a/website/docs/reference/commands/init.md b/website/docs/reference/commands/init.md index 873647814ec..ac55717c0ec 100644 --- a/website/docs/reference/commands/init.md +++ b/website/docs/reference/commands/init.md @@ -17,10 +17,21 @@ Then, it will: - Create a new folder with your project name and sample files, enough to get you started with dbt - Create a connection profile on your local machine. The default location is `~/.dbt/profiles.yml`. Read more in [configuring your profile](/docs/core/connect-data-platform/connection-profiles). + + +When using `dbt init` to initialize your project, include the `--profile` flag to specify an existing `profiles.yml` as the `profile:` key to use instead of creating a new one. For example, `dbt init --profile`. + + + +If the profile does not exist in `profiles.yml` or the command is run inside an existing project, the command raises an error. + + + ## Existing project If you've just cloned or downloaded an existing dbt project, `dbt init` can still help you set up your connection profile so that you can start working quickly. It will prompt you for connection information, as above, and add a profile (using the `profile` name from the project) to your local `profiles.yml`, or create the file if it doesn't already exist. + ## profile_template.yml `dbt init` knows how to prompt for connection information by looking for a file named `profile_template.yml`. It will look for this file in two places: diff --git a/website/docs/reference/commands/list.md b/website/docs/reference/commands/list.md index 6084b3dec70..5caabdc2b2e 100644 --- a/website/docs/reference/commands/list.md +++ b/website/docs/reference/commands/list.md @@ -8,9 +8,10 @@ id: "list" The `dbt ls` command lists resources in your dbt project. It accepts selector arguments that are similar to those provided in [dbt run](/reference/commands/run). `dbt list` is an alias for `dbt ls`. While `dbt ls` will read your [connection profile](/docs/core/connect-data-platform/connection-profiles) to resolve [`target`](/reference/dbt-jinja-functions/target)-specific logic, this command will not connect to your database or run any queries. 
### Usage + ``` dbt ls - [--resource-type {model,source,seed,snapshot,metric,test,exposure,analysis,default,all}] + [--resource-type {model,semantic_model,source,seed,snapshot,metric,test,exposure,analysis,default,all}] [--select SELECTION_ARG [SELECTION_ARG ...]] [--models SELECTOR [SELECTOR ...]] [--exclude SELECTOR [SELECTOR ...]] @@ -85,7 +86,7 @@ $ dbt ls --select snowplow.* --output json --output-keys "name resource_type des ``` -$ dbt ls --select snowplow.* --output json --output-keys name resource_type description +$ dbt ls --select snowplow.* --output json --output-keys "name resource_type description" {"name": "snowplow_events", "description": "This is a pretty cool model", ...} {"name": "snowplow_page_views", "description": "This model is even cooler", ...} ... @@ -93,6 +94,16 @@ $ dbt ls --select snowplow.* --output json --output-keys name resource_type desc + + +**Listing Semantic models** + +List all resources upstream of your orders semantic model: +``` +dbt ls -s +semantic_model:orders +``` + + **Listing file paths** ``` diff --git a/website/docs/reference/commands/retry.md b/website/docs/reference/commands/retry.md index d494a46cf1f..8da5d5a77a6 100644 --- a/website/docs/reference/commands/retry.md +++ b/website/docs/reference/commands/retry.md @@ -4,14 +4,6 @@ sidebar_label: "retry" id: "retry" --- -:::info Support in dbt Cloud - -`dbt retry` is supported in the dbt Cloud IDE. - -Native support for restarting scheduled runs from point of failure is currently in development & coming soon. - -::: - `dbt retry` re-executes the last `dbt` command from the node point of failure. If the previously executed `dbt` command was successful, `retry` will finish as `no operation`. Retry works with the following commands: diff --git a/website/docs/reference/commands/seed.md b/website/docs/reference/commands/seed.md index 8a410706842..d0cd199ea12 100644 --- a/website/docs/reference/commands/seed.md +++ b/website/docs/reference/commands/seed.md @@ -12,7 +12,7 @@ The `dbt seed` command will load `csv` files located in the `seed-paths` directo Specific seeds can be run using the `--select` flag to `dbt seed`. Example: ``` -$ dbt seed --select country_codes +$ dbt seed --select "country_codes" Found 2 models, 3 tests, 0 archives, 0 analyses, 53 macros, 0 operations, 2 seed files 14:46:15 | Concurrency: 1 threads (target='dev') diff --git a/website/docs/reference/commands/show.md b/website/docs/reference/commands/show.md index 5bdcfacc1e8..a0e5d68c83f 100644 --- a/website/docs/reference/commands/show.md +++ b/website/docs/reference/commands/show.md @@ -16,7 +16,7 @@ The results of the preview query are not materialized in the data warehouse, or Example: ``` -dbt show --select model_name.sql +dbt show --select "model_name.sql" ``` or ``` @@ -26,7 +26,7 @@ dbt show --inline "select * from {{ ref('model_name') }}" The following is an example of `dbt show` output for a model named `stg_orders`: ```bash -dbt show --select stg_orders +dbt show --select "stg_orders" 21:17:38 Running with dbt=1.5.0-b5 21:17:38 Found 5 models, 20 tests, 0 snapshots, 0 analyses, 425 macros, 0 operations, 3 seed files, 0 sources, 0 exposures, 0 metrics, 0 groups 21:17:38 @@ -46,7 +46,7 @@ dbt show --select stg_orders For example, if you've just built a model that has a failing test, you can quickly preview the test failures right in the terminal, to find values of `id` that are duplicated: ```bash -$ dbt build -s my_model_with_duplicates +$ dbt build -s "my_model_with_duplicates" 13:22:47 Running with dbt=1.5.0 ... 
13:22:48 Completed with 1 error and 0 warnings: @@ -58,7 +58,7 @@ $ dbt build -s my_model_with_duplicates 13:22:48 13:22:48 Done. PASS=1 WARN=0 ERROR=1 SKIP=0 TOTAL=2 -$ dbt show -s unique_my_model_with_duplicates_id +$ dbt show -s "unique_my_model_with_duplicates_id" 13:22:53 Running with dbt=1.5.0 13:22:53 Found 4 models, 2 tests, 0 snapshots, 0 analyses, 309 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics, 0 groups 13:22:53 diff --git a/website/docs/reference/commands/source.md b/website/docs/reference/commands/source.md index b29bf7dadc6..697ae2b5fcc 100644 --- a/website/docs/reference/commands/source.md +++ b/website/docs/reference/commands/source.md @@ -20,10 +20,10 @@ By default, `dbt source freshness` will calculate freshness information for all ```bash # Snapshot freshness for all Snowplow tables: -$ dbt source freshness --select source:snowplow +$ dbt source freshness --select "source:snowplow" # Snapshot freshness for a particular source table: -$ dbt source freshness --select source:snowplow.event +$ dbt source freshness --select "source:snowplow.event" ``` ### Configuring source freshness output diff --git a/website/docs/reference/commands/test.md b/website/docs/reference/commands/test.md index a1a63729568..c050d82a0ab 100644 --- a/website/docs/reference/commands/test.md +++ b/website/docs/reference/commands/test.md @@ -10,22 +10,22 @@ The tests to run can be selected using the `--select` flag discussed [here](/ref ```bash # run tests for one_specific_model -dbt test --select one_specific_model +dbt test --select "one_specific_model" # run tests for all models in package -dbt test --select some_package.* +dbt test --select "some_package.*" # run only tests defined singularly -dbt test --select test_type:singular +dbt test --select "test_type:singular" # run only tests defined generically -dbt test --select test_type:generic +dbt test --select "test_type:generic" # run singular tests limited to one_specific_model -dbt test --select one_specific_model,test_type:singular +dbt test --select "one_specific_model,test_type:singular" # run generic tests limited to one_specific_model -dbt test --select one_specific_model,test_type:generic +dbt test --select "one_specific_model,test_type:generic" ``` For more information on writing tests, see the [Testing Documentation](/docs/build/tests). diff --git a/website/docs/reference/database-permissions/about-database-permissions.md b/website/docs/reference/database-permissions/about-database-permissions.md new file mode 100644 index 00000000000..76fff517646 --- /dev/null +++ b/website/docs/reference/database-permissions/about-database-permissions.md @@ -0,0 +1,36 @@ +--- +title: "Database permissions" +id: about-database-permissions +description: "Database permissions are access rights and privileges granted to users or roles within a database management system." +sidebar_label: "About database permissions" +pagination_next: "reference/database-permissions/databricks-permissions" +pagination_prev: null +--- + +Database permissions are access rights and privileges granted to users or roles within a database or data platform. They help you specify what actions users or roles can perform on various database objects, like tables, views, schemas, or even the entire database. + + +### Why are they useful + +- Database permissions are essential for security and data access control. +- They ensure that only authorized users can perform specific actions. 
+- They help maintain data integrity, prevent unauthorized changes, and limit exposure to sensitive data. +- Permissions also support compliance with data privacy regulations and auditing. + +### How to use them + +- Users and administrators can grant and manage permissions at various levels (such as table, schema, and so on) using SQL statements or through the database system's interface. +- Assign permissions to individual users or roles (groups of users) based on their responsibilities. + - Typical permissions include "SELECT" (read), "INSERT" (add data), "UPDATE" (modify data), "DELETE" (remove data), and administrative rights like "CREATE" and "DROP." +- Users should be assigned permissions that ensure they have the necessary access to perform their tasks without overextending privileges. + +Something to note is that each data platform provider might have different approaches and names for privileges. Refer to their documentation for more details. + +### Examples + +Refer to the following database permission pages for more info on examples and how to set up database permissions: + +- [Databricks](/reference/database-permissions/databricks-permissions) +- [Postgres](/reference/database-permissions/postgres-permissions) +- [Redshift](/reference/database-permissions/redshift-permissions) +- [Snowflake](/reference/database-permissions/snowflake-permissions) diff --git a/website/docs/reference/database-permissions/databricks-permissions.md b/website/docs/reference/database-permissions/databricks-permissions.md new file mode 100644 index 00000000000..12e24652ae3 --- /dev/null +++ b/website/docs/reference/database-permissions/databricks-permissions.md @@ -0,0 +1,20 @@ +--- +title: "Databricks permissions" +--- + +In Databricks, permissions are used to control who can perform certain actions on different database objects. Use SQL statements to manage permissions in a Databricks database. + +## Example Databricks permissions + +The following example provides you with the SQL statements you can use to manage permissions. + +**Note** that you can grant permissions on `securable_objects` to `principals` (This can be user, service principal, or group). For example, `grant privilege_type` on `securable_object` to `principal`. + +``` + +grant all privileges on schema schema_name to principal; +grant create table on schema schema_name to principal; +grant create view on schema schema_name to principal; +``` + +Check out the [official documentation](https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/privileges.html#privilege-types-by-securable-object-in-unity-catalog) for more information. diff --git a/website/docs/reference/database-permissions/postgres-permissions.md b/website/docs/reference/database-permissions/postgres-permissions.md new file mode 100644 index 00000000000..da56e9b45f2 --- /dev/null +++ b/website/docs/reference/database-permissions/postgres-permissions.md @@ -0,0 +1,25 @@ +--- +title: "Postgres Permissions" +--- + + +In Postgres, permissions are used to control who can perform certain actions on different database objects. Use SQL statements to manage permissions in a Postgres database. + +## Example Postgres permissions + +The following example provides you with the SQL statements you can use to manage permissions. These examples allow you to run dbt smoothly without encountering permission issues, such as creating schemas, reading existing data, and accessing the information schema. 
+ +**Note** that `database_name`, `database.schema_name`, and `user_name` are placeholders and you can replace them as needed for your organization's naming convention. + +``` +grant usage on database database_name to user_name; +grant create schema on database database_name to user_name; +grant usage on schema database.schema_name to user_name; +grant create table on schema database.schema_name to user_name; +grant create view on schema database.schema_name to user_name; +grant usage on all schemas in database database_name to user_name; +grant select on all tables in database database_name to user_name; +grant select on all views in database database_name to user_name; +``` + +Check out the [official documentation](https://www.postgresql.org/docs/current/sql-grant.html) for more information. diff --git a/website/docs/reference/database-permissions/redshift-permissions.md b/website/docs/reference/database-permissions/redshift-permissions.md new file mode 100644 index 00000000000..5f0949a3528 --- /dev/null +++ b/website/docs/reference/database-permissions/redshift-permissions.md @@ -0,0 +1,25 @@ +--- +title: "Redshift permissions" +--- + +In Redshift, permissions are used to control who can perform certain actions on different database objects. Use SQL statements to manage permissions in a Redshift database. + +## Example Redshift permissions + +The following example provides you with the SQL statements you can use to manage permissions. + +**Note** that `database_name`, `database.schema_name`, and `user_name` are placeholders and you can replace them as needed for your organization's naming convention. + + +``` +grant usage on database database_name to user_name; +grant create schema on database database_name to user_name; +grant usage on schema database.schema_name to user_name; +grant create table on schema database.schema_name to user_name; +grant create view on schema database.schema_name to user_name; +grant usage on all schemas in database database_name to user_name; +grant select on all tables in database database_name to user_name; +grant select on all views in database database_name to user_name; +``` + +Check out the [official documentation](https://docs.aws.amazon.com/redshift/latest/dg/r_GRANT.html) for more information. diff --git a/website/docs/reference/database-permissions/snowflake-permissions.md b/website/docs/reference/database-permissions/snowflake-permissions.md new file mode 100644 index 00000000000..3f474242834 --- /dev/null +++ b/website/docs/reference/database-permissions/snowflake-permissions.md @@ -0,0 +1,154 @@ +--- +title: "Snowflake permissions" +--- + +In Snowflake, permissions are used to control who can perform certain actions on different database objects. Use SQL statements to manage permissions in a Snowflake database. + +## Set up Snowflake account + +This section explains how to set up permissions and roles within Snowflake. In Snowflake, you would perform these actions using SQL commands and set up your data warehouse and access control within Snowflake's ecosystem. + +1. Set up databases +``` +use role sysadmin; +create database raw; +create database analytics; +``` +2. 
Set up warehouses
+```
+create warehouse loading
+  warehouse_size = xsmall
+  auto_suspend = 3600
+  auto_resume = false
+  initially_suspended = true;
+
+create warehouse transforming
+  warehouse_size = xsmall
+  auto_suspend = 60
+  auto_resume = true
+  initially_suspended = true;
+
+create warehouse reporting
+  warehouse_size = xsmall
+  auto_suspend = 60
+  auto_resume = true
+  initially_suspended = true;
+```
+
+3. Set up roles and warehouse permissions
+```
+use role securityadmin;
+
+create role loader;
+grant all on warehouse loading to role loader;
+
+create role transformer;
+grant all on warehouse transforming to role transformer;
+
+create role reporter;
+grant all on warehouse reporting to role reporter;
+```
+
+4. Create users, assigning them to their roles
+
+Every person and application gets a separate user and is assigned to the correct role.
+
+```
+create user stitch_user -- or fivetran_user
+  password = '_generate_this_'
+  default_warehouse = loading
+  default_role = loader;
+
+create user claire -- or amy, jeremy, etc.
+  password = '_generate_this_'
+  default_warehouse = transforming
+  default_role = transformer
+  must_change_password = true;
+
+create user dbt_cloud_user
+  password = '_generate_this_'
+  default_warehouse = transforming
+  default_role = transformer;
+
+create user looker_user -- or mode_user etc.
+  password = '_generate_this_'
+  default_warehouse = reporting
+  default_role = reporter;
+
+-- then grant these roles to each user
+grant role loader to user stitch_user; -- or fivetran_user
+grant role transformer to user dbt_cloud_user;
+grant role transformer to user claire; -- or amy, jeremy
+grant role reporter to user looker_user; -- or mode_user, periscope_user
+```
+
+5. Let loader load data
+Give the role unilateral permission to operate on the raw database
+```
+use role sysadmin;
+grant all on database raw to role loader;
+```
+
+6. Let transformer transform data
+The transformer role needs to be able to read raw data.
+
+If you do this before you have any data loaded, you can run:
+```
+grant usage on database raw to role transformer;
+grant usage on future schemas in database raw to role transformer;
+grant select on future tables in database raw to role transformer;
+grant select on future views in database raw to role transformer;
+```
+If you already have data loaded in the raw database, also make sure you run the following to update the permissions:
+```
+grant usage on all schemas in database raw to role transformer;
+grant select on all tables in database raw to role transformer;
+grant select on all views in database raw to role transformer;
+```
+transformer also needs to be able to create in the analytics database:
+```
+grant all on database analytics to role transformer;
+```
+7. Let reporter read the transformed data
+A previous version of this article recommended this be implemented through hooks in dbt, but this way lets you get away with a one-off statement.
+```
+grant usage on database analytics to role reporter;
+grant usage on future schemas in database analytics to role reporter;
+grant select on future tables in database analytics to role reporter;
+grant select on future views in database analytics to role reporter;
+```
+Again, if you already have data in your analytics database, make sure you run:
+```
+grant usage on all schemas in database analytics to role reporter;
+grant select on all tables in database analytics to role reporter;
+grant select on all views in database analytics to role reporter;
+```
+8. 
Maintain +When new users are added, make sure you add them to the right role! Everything else should be inherited automatically thanks to those `future` grants. + +For more discussion and legacy information, refer to [this Discourse article](https://discourse.getdbt.com/t/setting-up-snowflake-the-exact-grant-statements-we-run/439). + +## Example Snowflake permissions + +The following example provides you with the SQL statements you can use to manage permissions. + +**Note** that `warehouse_name`, `database_name`, and `role_name` are placeholders and you can replace them as needed for your organization's naming convention. + +``` + +grant all on warehouse warehouse_name to role role_name; +grant usage on database database_name to role role_name; +grant create schema on database database_name to role role_name; +grant usage on schema database.an_existing_schema to role role_name; +grant create table on schema database.an_existing_schema to role role_name; +grant create view on schema database.an_existing_schema to role role_name; +grant usage on future schemas in database database_name to role role_name; +grant monitor on future schemas in database database_name to role role_name; +grant select on future tables in database database_name to role role_name; +grant select on future views in database database_name to role role_name; +grant usage on all schemas in database database_name to role role_name; +grant monitor on all schemas in database database_name to role role_name; +grant select on all tables in database database_name to role role_name; +grant select on all views in database database_name to role role_name; +``` + diff --git a/website/docs/reference/dbt-commands.md b/website/docs/reference/dbt-commands.md index 862829ef809..4bc3ddc24d7 100644 --- a/website/docs/reference/dbt-commands.md +++ b/website/docs/reference/dbt-commands.md @@ -2,54 +2,59 @@ title: "dbt Command reference" --- -dbt is typically run one of two ways: +You can run dbt using the following tools: -* In [dbt Cloud](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) -* On the [command line interface](/docs/core/about-the-cli) (CLI) +- In your browser with the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) +- On the command line interface using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or open-source [dbt Core](/docs/core/about-dbt-core), both of which enable you to execute dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features). The following sections outline the commands supported by dbt and their relevant flags. For information about selecting models on the command line, consult the docs on [Model selection syntax](/reference/node-selection/syntax). ### Available commands - - -Use the following dbt commands in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [CLI](/docs/core/about-the-cli). Use the `dbt` prefix. For example, to run the `test` command, type `dbt test`. 
- -| Command | Description | Version | -| ------- | ----------- | ------- | -| [build](/reference/commands/build) | Build and test all selected resources (models, seeds, snapshots, tests) | All [supported versions](/docs/dbt-versions/core) | -| [clean](/reference/commands/clean) | Deletes artifacts present in the dbt project | All [supported versions](/docs/dbt-versions/core) | -| [clone](/reference/commands/clone) | Clone selected models from the specified state | Requires [dbt v1.6 or higher](/docs/dbt-versions/core) | -| [compile](/reference/commands/compile) | Compiles (but does not run) the models in a project | All [supported versions](/docs/dbt-versions/core) | -| [debug](/reference/commands/debug) | Debugs dbt connections and projects | All [supported versions](/docs/dbt-versions/core) | -| [deps](/reference/commands/deps) | Downloads dependencies for a project | All [supported versions](/docs/dbt-versions/core) | -| [docs](/reference/commands/cmd-docs) | Generates documentation for a project | All [supported versions](/docs/dbt-versions/core) | -| [list](/reference/commands/list) | Lists resources defined in a dbt project | All [supported versions](/docs/dbt-versions/core) | -| [parse](/reference/commands/parse) | Parses a project and writes detailed timing info | All [supported versions](/docs/dbt-versions/core) | -| [retry](/reference/commands/retry) | Retry the last run `dbt` command from the point of failure | Requires [dbt v1.6 or higher](/docs/dbt-versions/core) | -| [run](/reference/commands/run) | Runs the models in a project | All [supported versions](/docs/dbt-versions/core) | -| [run-operation](/reference/commands/run-operation) | Invoke a macro, including running arbitrary maintenance SQL against
the database | All [supported versions](/docs/dbt-versions/core) | -| [seed](/reference/commands/seed) | Loads CSV files into the database | All [supported versions](/docs/dbt-versions/core) | -| [show](/reference/commands/show) | Preview table rows post-transformation | All [supported versions](/docs/dbt-versions/core) | -| [snapshot](/reference/commands/snapshot) | Executes "snapshot" jobs defined in a project | All [supported versions](/docs/dbt-versions/core) | -| [source](/reference/commands/source) | Provides tools for working with source data (including validating that
sources are "fresh") | All [supported versions](/docs/dbt-versions/core) | -| [test](/reference/commands/test) | Executes tests defined in a project | All [supported versions](/docs/dbt-versions/core) | -| [init](/reference/commands/init) | Initializes a new dbt project (CLI only) | All [supported versions](/docs/dbt-versions/core) | + + +All commands in the table are compatible with either the dbt Cloud IDE, dbt Cloud CLI, or dbt Core. + +You can run dbt commands in your specific tool by prefixing them with `dbt`. For example, to run the `test` command, type `dbt test`. + +| Command | Description | Compatible tools | Version | +| ------- | ----------- | ---------------- | ------- | +| [build](/reference/commands/build) | Build and test all selected resources (models, seeds, snapshots, tests) | All | All [supported versions](/docs/dbt-versions/core) | +| cancel | Cancels the most recent invocation.| dbt Cloud CLI | Requires [dbt v1.6 or higher](/docs/dbt-versions/core) | +| [clean](/reference/commands/clean) | Deletes artifacts present in the dbt project | All | All [supported versions](/docs/dbt-versions/core) | +| [clone](/reference/commands/clone) | Clone selected models from the specified state | dbt Cloud CLI
dbt Core | Requires [dbt v1.6 or higher](/docs/dbt-versions/core) | +| [compile](/reference/commands/compile) | Compiles (but does not run) the models in a project | All | All [supported versions](/docs/dbt-versions/core) | +| [debug](/reference/commands/debug) | Debugs dbt connections and projects | dbt Core | All [supported versions](/docs/dbt-versions/core) | +| [deps](/reference/commands/deps) | Downloads dependencies for a project | All | All [supported versions](/docs/dbt-versions/core) | +| [docs](/reference/commands/cmd-docs) | Generates documentation for a project | All | All [supported versions](/docs/dbt-versions/core) | +| help | Displays help information for any command | dbt Core
dbt Cloud CLI | All [supported versions](/docs/dbt-versions/core) | +| [init](/reference/commands/init) | Initializes a new dbt project | dbt Core | All [supported versions](/docs/dbt-versions/core) | +| [list](/reference/commands/list) | Lists resources defined in a dbt project | All | All [supported versions](/docs/dbt-versions/core) | +| [parse](/reference/commands/parse) | Parses a project and writes detailed timing info | All | All [supported versions](/docs/dbt-versions/core) | +| reattach | Reattaches to the most recent invocation to retrieve logs and artifacts. | dbt Cloud CLI | Requires [dbt v1.6 or higher](/docs/dbt-versions/core) | +| [retry](/reference/commands/retry) | Retry the last run `dbt` command from the point of failure | All | Requires [dbt v1.6 or higher](/docs/dbt-versions/core) | +| [run](/reference/commands/run) | Runs the models in a project | All | All [supported versions](/docs/dbt-versions/core) | +| [run-operation](/reference/commands/run-operation) | Invoke a macro, including running arbitrary maintenance SQL against the database | All | All [supported versions](/docs/dbt-versions/core) | +| [seed](/reference/commands/seed) | Loads CSV files into the database | All | All [supported versions](/docs/dbt-versions/core) | +| [show](/reference/commands/show) | Preview table rows post-transformation | All | All [supported versions](/docs/dbt-versions/core) | +| [snapshot](/reference/commands/snapshot) | Executes "snapshot" jobs defined in a project | All | All [supported versions](/docs/dbt-versions/core) | +| [source](/reference/commands/source) | Provides tools for working with source data (including validating that sources are "fresh") | All | All [supported versions](/docs/dbt-versions/core) | +| [test](/reference/commands/test) | Executes tests defined in a project | All | All [supported versions](/docs/dbt-versions/core) |
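As a quick illustration of a couple of commands from the table (a minimal sketch — `my_model` is a placeholder selector):

```bash
# Build my_model and everything downstream of it, then re-run from the point of
# failure if anything errored (retry requires dbt v1.6 or higher and is a no-op
# when the previous invocation succeeded).
dbt build --select "my_model+"
dbt retry
```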
- + Select the tabs that are relevant to your development workflow. For example, if you develop in the dbt Cloud IDE, select **dbt Cloud**. - + Use the following dbt commands in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) and use the `dbt` prefix. For example, to run the `test` command, type `dbt test`. - [build](/reference/commands/build): build and test all selected resources (models, seeds, snapshots, tests) -- [clone](/reference/commands/clone): clone selected nodes from specified state (requires dbt 1.6 or higher) +- [clone](/reference/commands/clone): clone selected nodes from the specified state (requires dbt 1.6 or higher) - [compile](/reference/commands/compile): compiles (but does not run) the models in a project - [deps](/reference/commands/deps): downloads dependencies for a project - [docs](/reference/commands/cmd-docs) : generates documentation for a project @@ -64,13 +69,13 @@ Use the following dbt commands in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/ - + -Use the following dbt commands in the [CLI](/docs/core/about-the-cli) and use the `dbt` prefix. For example, to run the `test` command, type `dbt test`. +Use the following dbt commands in [dbt Core](/docs/core/about-dbt-core) and use the `dbt` prefix. For example, to run the `test` command, type `dbt test`. - [build](/reference/commands/build): build and test all selected resources (models, seeds, snapshots, tests) - [clean](/reference/commands/clean): deletes artifacts present in the dbt project -- [clone](/reference/commands/clone): clone selected models from specified state (requires dbt 1.6 or higher) +- [clone](/reference/commands/clone): clone selected models from the specified state (requires dbt 1.6 or higher) - [compile](/reference/commands/compile): compiles (but does not run) the models in a project - [debug](/reference/commands/debug): debugs dbt connections and projects - [deps](/reference/commands/deps): downloads dependencies for a project diff --git a/website/docs/reference/dbt-jinja-functions/model.md b/website/docs/reference/dbt-jinja-functions/model.md index e967debd01f..9ccf0759470 100644 --- a/website/docs/reference/dbt-jinja-functions/model.md +++ b/website/docs/reference/dbt-jinja-functions/model.md @@ -52,15 +52,9 @@ To view the structure of `models` and their definitions: Use the following table to understand how the versioning pattern works and match the Manifest version with the dbt version: -| dbt version | Manifest version | -| ----------- | ---------------- | -| `v1.5` | [Manifest v9](https://schemas.getdbt.com/dbt/manifest/v9/index.html) -| `v1.4` | [Manifest v8](https://schemas.getdbt.com/dbt/manifest/v8/index.html) -| `v1.3` | [Manifest v7](https://schemas.getdbt.com/dbt/manifest/v7/index.html) -| `v1.2` | [Manifest v6](https://schemas.getdbt.com/dbt/manifest/v6/index.html) -| `v1.1` | [Manifest v5](https://schemas.getdbt.com/dbt/manifest/v5/index.html) - +import ManifestVersions from '/snippets/_manifest-versions.md'; + ## Related docs diff --git a/website/docs/reference/dbt-jinja-functions/target.md b/website/docs/reference/dbt-jinja-functions/target.md index 7d6627c5a4b..e7d08db592f 100644 --- a/website/docs/reference/dbt-jinja-functions/target.md +++ b/website/docs/reference/dbt-jinja-functions/target.md @@ -7,7 +7,7 @@ description: "Contains information about your connection to the warehouse." `target` contains information about your connection to the warehouse. 
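For example, here is a minimal sketch of a model that references a few common `target` attributes (the column aliases are arbitrary):

```sql
-- Records which connection details the model was built against.
select
    '{{ target.name }}' as target_name,
    '{{ target.schema }}' as target_schema,
    '{{ target.type }}' as target_type
```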
-* **dbt CLI:** These values are based on the target defined in your [`profiles.yml` file](/docs/core/connect-data-platform/profiles.yml) +* **dbt Core:** These values are based on the target defined in your [`profiles.yml` file](/docs/core/connect-data-platform/profiles.yml) * **dbt Cloud Scheduler:** * `target.name` is defined per job as described [here](/docs/build/custom-target-names). * For all other attributes, the values are defined by the deployment connection. To check these values, click **Deploy** from the upper left and select **Environments**. Then, select the relevant deployment environment, and click **Settings**. diff --git a/website/docs/reference/dbt_project.yml.md b/website/docs/reference/dbt_project.yml.md index c706b57a73b..9bd85d0d5dd 100644 --- a/website/docs/reference/dbt_project.yml.md +++ b/website/docs/reference/dbt_project.yml.md @@ -11,6 +11,8 @@ By default, dbt will look for `dbt_project.yml` in your current working director By default, dbt will look for `dbt_project.yml` in your current working directory and its parents, but you can set a different directory using the `--project-dir` flag or the `DBT_PROJECT_DIR` environment variable. +Starting from dbt v1.5 and higher, you can specify your dbt Cloud project ID in the `dbt_project.yml` file using `project-id` under the `dbt-cloud` config. To find your project ID, check your dbt Cloud project URL, such as `https://cloud.getdbt.com/11/projects/123456`, where the project ID is `123456`. + The following is a list of all available configurations in the `dbt_project.yml` file. @@ -19,6 +21,81 @@ The following is a list of all available configurations in the `dbt_project.yml` dbt uses YAML in a few different places. If you're new to YAML, it would be worth taking the time to learn how arrays, dictionaries and strings are represented. 
::: + + + + + +```yml +[name](/reference/project-configs/name): string + +[config-version](/reference/project-configs/config-version): 2 +[version](/reference/project-configs/version): version + +[profile](/reference/project-configs/profile): profilename + +[model-paths](/reference/project-configs/model-paths): [directorypath] +[seed-paths](/reference/project-configs/seed-paths): [directorypath] +[test-paths](/reference/project-configs/test-paths): [directorypath] +[analysis-paths](/reference/project-configs/analysis-paths): [directorypath] +[macro-paths](/reference/project-configs/macro-paths): [directorypath] +[snapshot-paths](/reference/project-configs/snapshot-paths): [directorypath] +[docs-paths](/reference/project-configs/docs-paths): [directorypath] +[asset-paths](/reference/project-configs/asset-paths): [directorypath] + +[target-path](/reference/project-configs/target-path): directorypath +[log-path](/reference/project-configs/log-path): directorypath +[packages-install-path](/reference/project-configs/packages-install-path): directorypath + +[clean-targets](/reference/project-configs/clean-targets): [directorypath] + +[query-comment](/reference/project-configs/query-comment): string + +[require-dbt-version](/reference/project-configs/require-dbt-version): version-range | [version-range] + +[dbt-cloud](/docs/cloud/cloud-cli-installation): + [project-id](/docs/cloud/configure-cloud-cli#configure-the-dbt-cloud-cli): project_id # Required + [defer-env-id](/docs/cloud/about-cloud-develop-defer#defer-in-dbt-cloud-cli): environment_id # Optional + +[quoting](/reference/project-configs/quoting): + database: true | false + schema: true | false + identifier: true | false + +models: + [](/reference/model-configs) + +seeds: + [](/reference/seed-configs) + +snapshots: + [](/reference/snapshot-configs) + +sources: + [](source-configs) + +tests: + [](/reference/test-configs) + +vars: + [](/docs/build/project-variables) + +[on-run-start](/reference/project-configs/on-run-start-on-run-end): sql-statement | [sql-statement] +[on-run-end](/reference/project-configs/on-run-start-on-run-end): sql-statement | [sql-statement] + +[dispatch](/reference/project-configs/dispatch-config): + - macro_namespace: packagename + search_order: [packagename] + +[restrict-access](/docs/collaborate/govern/model-access): true | false + +``` + + + + + + ```yml @@ -79,6 +156,9 @@ vars: search_order: [packagename] [restrict-access](/docs/collaborate/govern/model-access): true | false + ``` + + diff --git a/website/docs/reference/global-configs/about-global-configs.md b/website/docs/reference/global-configs/about-global-configs.md index 42819cdac8f..9d1691812b5 100644 --- a/website/docs/reference/global-configs/about-global-configs.md +++ b/website/docs/reference/global-configs/about-global-configs.md @@ -8,4 +8,11 @@ Global configs enable you to fine-tune _how_ dbt runs projects on your machine Global configs control things like the visual output of logs, the manner in which dbt parses your project, and what to do when dbt finds a version mismatch or a failing model. These configs are "global" because they are available for all dbt commands, and because they can be set for all projects running on the same machine or in the same environment. -Starting in v1.0, you can set global configs in three places. When all three are set, command line flags take precedence, then environment variables, and last yaml configs (usually `profiles.yml`). 
\ No newline at end of file +### Global config precedence + +Starting in v1.0, you can set global configs in three places. dbt will evaluate the configs in the following order: +1. [user config](https://docs.getdbt.com/reference/global-configs/yaml-configurations) +1. [environment variable](https://docs.getdbt.com/reference/global-configs/environment-variable-configs) +1. [CLI flag](https://docs.getdbt.com/reference/global-configs/command-line-flags) + +Each config is prioritized over the previous one. For example, if all three are provided, then the CLI flag takes precedence. diff --git a/website/docs/reference/global-configs/command-line-flags.md b/website/docs/reference/global-configs/command-line-flags.md index 6496c92da6d..fbe89ce28f1 100644 --- a/website/docs/reference/global-configs/command-line-flags.md +++ b/website/docs/reference/global-configs/command-line-flags.md @@ -4,60 +4,95 @@ id: "command-line-flags" sidebar: "Command line flags" --- -Command line (CLI) flags immediately follow `dbt` and precede your subcommand. When set, CLI flags override environment variables and profile configs. +For consistency, command-line interface (CLI) flags should come right after the `dbt` prefix and its subcommands. This includes "global" flags (supported for all commands). When set, CLI flags override environment variables and profile configs. -Use this non-boolean config structure, replacing `` with the config you are enabling or disabling, `` with the new setting for the config, and `` with the command this config applies to: +For example, instead of using: + +```bash +dbt --no-populate-cache run +``` + +You should use: + +```bash +dbt run --no-populate-cache +``` + +Historically, passing flags (such as "global flags") _before_ the subcommand is a legacy functionality that dbt Labs can remove at any time. We do not support using the same flag before and after the subcommand. + +## Using boolean and non-boolean flags + +You can construct your commands with boolean flags to enable or disable or with non-boolean flags that use specific values, such as strings. + + + + + +Use this non-boolean config structure: +- Replacing `` with the command this config applies to. +- `` with the config you are enabling or disabling, and +- `` with the new setting for the config. ```text -$ --= + --= ``` -Non-boolean config examples: +### Example ```text -dbt --printer-width=80 run -dbt --indirect-selection=eager test +dbt run --printer-width=80 +dbt test --indirect-selection=eager ``` -To turn on boolean configs, you would use the `--` CLI flag, and a `--no-` CLI flag to turn off boolean configs, replacing `` with the config you are enabling or disabling and `` with the command this config applies to. + + + + +To enable or disable boolean configs: +- Use `` this config applies to. +- Followed by `--` to turn it on, or `--no-` to turn it off. 
+- Replace `` with the config you are enabling or disabling -Boolean config structure: ```text -dbt -- -dbt --no- +dbt -- +dbt --no- ``` -Boolean config example: +### Example ```text -dbt --version-check run -dbt --no-version-check run +dbt run --version-check +dbt run --no-version-check ``` - \ No newline at end of file + + + + + diff --git a/website/docs/reference/model-properties.md b/website/docs/reference/model-properties.md index 730432c88af..63adc1f0d63 100644 --- a/website/docs/reference/model-properties.md +++ b/website/docs/reference/model-properties.md @@ -18,7 +18,7 @@ models: show: true | false [latest_version](/reference/resource-properties/latest_version): [deprecation_date](/reference/resource-properties/deprecation_date): - [access](/reference/resource-properties/access): private | protected | public + [access](/reference/resource-configs/access): private | protected | public [config](/reference/resource-properties/config): [](/reference/model-configs): [constraints](/reference/resource-properties/constraints): @@ -46,7 +46,7 @@ models: [description](/reference/resource-properties/description): [docs](/reference/resource-configs/docs): show: true | false - [access](/reference/resource-properties/access): private | protected | public + [access](/reference/resource-configs/access): private | protected | public [constraints](/reference/resource-properties/constraints): - [config](/reference/resource-properties/config): diff --git a/website/docs/reference/node-selection/defer.md b/website/docs/reference/node-selection/defer.md index e13a4f6648a..03c3b2aac12 100644 --- a/website/docs/reference/node-selection/defer.md +++ b/website/docs/reference/node-selection/defer.md @@ -17,16 +17,16 @@ It is possible to use separate state for `state:modified` and `--defer`, by pass ### Usage ```shell -$ dbt run --select [...] --defer --state path/to/artifacts -$ dbt test --select [...] --defer --state path/to/artifacts +dbt run --select [...] --defer --state path/to/artifacts +dbt test --select [...] --defer --state path/to/artifacts ``` ```shell -$ dbt run --models [...] --defer --state path/to/artifacts -$ dbt test --models [...] --defer --state path/to/artifacts +dbt run --models [...] --defer --state path/to/artifacts +dbt test --models [...] --defer --state path/to/artifacts ``` @@ -101,7 +101,7 @@ I want to test my changes. Nothing exists in my development schema, `dev_alice`. ```shell -$ dbt run --select model_b +dbt run --select "model_b" ``` @@ -128,7 +128,7 @@ Unless I had previously run `model_a` into this development environment, `dev_al ```shell -$ dbt run --select model_b --defer --state prod-run-artifacts +dbt run --select "model_b" --defer --state prod-run-artifacts ``` @@ -186,7 +186,7 @@ models: ```shell -dbt test --select model_b +dbt test --select "model_b" ``` @@ -211,7 +211,7 @@ The `relationships` test requires both `model_a` and `model_b`. Because I did no ```shell -dbt test --select model_b --defer --state prod-run-artifacts +dbt test --select "model_b" --defer --state prod-run-artifacts ``` diff --git a/website/docs/reference/node-selection/exclude.md b/website/docs/reference/node-selection/exclude.md index 9ad4bd1cc0e..d2c140d1bb5 100644 --- a/website/docs/reference/node-selection/exclude.md +++ b/website/docs/reference/node-selection/exclude.md @@ -7,19 +7,19 @@ sidebar_label: "Exclude" dbt provides an `--exclude` flag with the same semantics as `--select`. Models specified with the `--exclude` flag will be removed from the set of models selected with `--select`. 
```bash -$ dbt run --select my_package.*+ --exclude my_package.a_big_model+ # select all models in my_package and their children except a_big_model and its children +dbt run --select "my_package".*+ --exclude "my_package.a_big_model+" # select all models in my_package and their children except a_big_model and its children ``` Exclude a specific resource by its name or lineage: ```bash # test -$ dbt test --exclude not_null_orders_order_id # test all models except the not_null_orders_order_id test -$ dbt test --exclude orders # test all models except tests associated with the orders model +dbt test --exclude "not_null_orders_order_id" # test all models except the not_null_orders_order_id test +dbt test --exclude "orders" # test all models except tests associated with the orders model # seed -$ dbt seed --exclude account_parent_mappings # load all seeds except account_parent_mappings +dbt seed --exclude "account_parent_mappings" # load all seeds except account_parent_mappings # snapshot -$ dbt snapshot --exclude snap_order_statuses # execute all snapshots except snap_order_statuses +dbt snapshot --exclude "snap_order_statuses" # execute all snapshots except snap_order_statuses ``` diff --git a/website/docs/reference/node-selection/graph-operators.md b/website/docs/reference/node-selection/graph-operators.md index 4fdc2f10628..8cba43e1b52 100644 --- a/website/docs/reference/node-selection/graph-operators.md +++ b/website/docs/reference/node-selection/graph-operators.md @@ -7,9 +7,9 @@ If placed at the front of the model selector, `+` will select all parents of the ```bash - $ dbt run --select my_model+ # select my_model and all children - $ dbt run --select +my_model # select my_model and all parents - $ dbt run --select +my_model+ # select my_model, and all of its parents and children +dbt run --select "my_model+" # select my_model and all children +dbt run --select "+my_model" # select my_model and all parents +dbt run --select "+my_model+" # select my_model, and all of its parents and children ``` @@ -20,9 +20,9 @@ to step through. 
```bash - $ dbt run --select my_model+1 # select my_model and its first-degree children - $ dbt run --select 2+my_model # select my_model, its first-degree parents, and its second-degree parents ("grandparents") - $ dbt run --select 3+my_model+4 # select my_model, its parents up to the 3rd degree, and its children down to the 4th degree +dbt run --select "my_model+1" # select my_model and its first-degree children +dbt run --select "2+my_model" # select my_model, its first-degree parents, and its second-degree parents ("grandparents") +dbt run --select "3+my_model+4" # select my_model, its parents up to the 3rd degree, and its children down to the 4th degree ``` @@ -32,5 +32,5 @@ The `@` operator is similar to `+`, but will also include _the parents of the ch ```bash -$ dbt run --models @my_model # select my_model, its children, and the parents of its children +dbt run --models @my_model # select my_model, its children, and the parents of its children ``` diff --git a/website/docs/reference/node-selection/methods.md b/website/docs/reference/node-selection/methods.md index 2647f3416a3..e29612e3401 100644 --- a/website/docs/reference/node-selection/methods.md +++ b/website/docs/reference/node-selection/methods.md @@ -34,8 +34,8 @@ The `tag:` method is used to select models that match a specified [tag](/referen ```bash - $ dbt run --select tag:nightly # run all models with the `nightly` tag - ``` +dbt run --select "tag:nightly" # run all models with the `nightly` tag +``` ### The "source" method @@ -43,22 +43,22 @@ The `source` method is used to select models that select from a specified [sourc ```bash - $ dbt run --select source:snowplow+ # run all models that select from Snowplow sources - ``` +dbt run --select "source:snowplow+" # run all models that select from Snowplow sources +``` ### The "resource_type" method Use the `resource_type` method to select nodes of a particular type (`model`, `test`, `exposure`, and so on). This is similar to the `--resource-type` flag used by the [`dbt ls` command](/reference/commands/list). ```bash - $ dbt build --select resource_type:exposure # build all resources upstream of exposures - $ dbt list --select resource_type:test # list all tests in your project - ``` +dbt build --select "resource_type:exposure" # build all resources upstream of exposures +dbt list --select "resource_type:test" # list all tests in your project +``` Note: This method doesn't work for sources, so use the [`--resource-type`](/reference/commands/list) option of the list command instead: ```bash - $ dbt list --resource-type source - ``` +dbt list --resource-type source +``` ### The "path" method The `path` method is used to select models/sources defined at or under a specific path. @@ -69,12 +69,12 @@ selectors unambiguous. 
```bash # These two selectors are equivalent - dbt run --select path:models/staging/github - dbt run --select models/staging/github + dbt run --select "path:models/staging/github" + dbt run --select "models/staging/github" # These two selectors are equivalent - dbt run --select path:models/staging/github/stg_issues.sql - dbt run --select models/staging/github/stg_issues.sql + dbt run --select "path:models/staging/github/stg_issues.sql" + dbt run --select "models/staging/github/stg_issues.sql" ``` @@ -85,9 +85,9 @@ The `file` method can be used to select a model by its filename, including the f ```bash # These are equivalent -dbt run --select file:some_model.sql -dbt run --select some_model.sql -dbt run --select some_model +dbt run --select "file:some_model.sql" +dbt run --select "some_model.sql" +dbt run --select "some_model" ``` @@ -96,10 +96,10 @@ dbt run --select some_model The `fqn` method is used to select nodes based off their "fully qualified names" (FQN) within the dbt graph. The default output of [`dbt list`](/reference/commands/list) is a listing of FQN. -``` -dbt run --select fqn:some_model -dbt run --select fqn:your_project.some_model -dbt run --select fqn:some_package.some_other_model +```bash +dbt run --select "fqn:some_model" +dbt run --select "fqn:your_project.some_model" +dbt run --select "fqn:some_package.some_other_model" ``` ### The "package" method @@ -111,10 +111,10 @@ selectors unambiguous. ```bash # These three selectors are equivalent - dbt run --select package:snowplow - dbt run --select snowplow - dbt run --select snowplow.* - ``` + dbt run --select "package:snowplow" + dbt run --select "snowplow" + dbt run --select "snowplow.*" +``` ### The "config" method @@ -124,10 +124,10 @@ The `config` method is used to select models that match a specified [node config ```bash - $ dbt run --select config.materialized:incremental # run all models that are materialized incrementally - $ dbt run --select config.schema:audit # run all models that are created in the `audit` schema - $ dbt run --select config.cluster_by:geo_country # run all models clustered by `geo_country` - ``` +dbt run --select "config.materialized:incremental" # run all models that are materialized incrementally +dbt run --select "config.schema:audit" # run all models that are created in the `audit` schema +dbt run --select "config.cluster_by:geo_country" # run all models clustered by `geo_country` +``` @@ -135,7 +135,8 @@ The `config` method is used to select models that match a specified [node config While most config values are strings, you can also use the `config` method to match boolean configs, dictionary keys, and values in lists. For example, given a model with the following configurations: -``` + +```bash {{ config( materialized = 'incremental', unique_key = ['column_a', 'column_b'], @@ -148,10 +149,10 @@ select ... 
You can select using any of the following: ```bash -$ dbt ls -s config.materialized:incremental -$ dbt ls -s config.unique_key:column_a -$ dbt ls -s config.grants.select:reporter -$ dbt ls -s config.transient:true +dbt ls -s config.materialized:incremental +dbt ls -s config.unique_key:column_a +dbt ls -s config.grants.select:reporter +dbt ls -s config.transient:true ``` @@ -162,10 +163,10 @@ The `test_type` method is used to select tests based on their type, `singular` o - ```bash - $ dbt test --select test_type:generic # run all generic tests - $ dbt test --select test_type:singular # run all singular tests - ``` +```bash +dbt test --select "test_type:generic" # run all generic tests +dbt test --select "test_type:singular" # run all singular tests +``` ### The "test_name" method @@ -176,10 +177,10 @@ that defines it. For more information about how generic tests are defined, read ```bash - $ dbt test --select test_name:unique # run all instances of the `unique` test - $ dbt test --select test_name:equality # run all instances of the `dbt_utils.equality` test - $ dbt test --select test_name:range_min_max # run all instances of a custom schema test defined in the local project, `range_min_max` - ``` +dbt test --select "test_name:unique" # run all instances of the `unique` test +dbt test --select "test_name:equality" # run all instances of the `dbt_utils.equality` test +dbt test --select "test_name:range_min_max" # run all instances of a custom schema test defined in the local project, `range_min_max` +``` ### The "state" method @@ -204,9 +205,9 @@ The `state` method is used to select nodes by comparing them against a previous ```bash - $ dbt test --select state:new # run all tests on new models + and new tests on old models - $ dbt run --select state:modified # run all models that have been modified - $ dbt ls --select state:modified # list all modified nodes (not just models) +dbt test --select "state:new " # run all tests on new models + and new tests on old models +dbt run --select "state:modified" # run all models that have been modified +dbt ls --select "state:modified" # list all modified nodes (not just models) ``` @@ -236,18 +237,18 @@ The `exposure` method is used to select parent resources of a specified [exposur ```bash - $ dbt run --select +exposure:weekly_kpis # run all models that feed into the weekly_kpis exposure - $ dbt test --select +exposure:* # test all resources upstream of all exposures - $ dbt ls --select +exposure:* --resource-type source # list all sources upstream of all exposures - ``` +dbt run --select "+exposure:weekly_kpis" # run all models that feed into the weekly_kpis exposure +dbt test --select "+exposure:*" # test all resources upstream of all exposures +dbt ls --select "+exposure:*" --resource-type source # list all sources upstream of all exposures +``` ### The "metric" method The `metric` method is used to select parent resources of a specified [metric](/docs/build/metrics). Use in conjunction with the `+` operator. 
```bash -$ dbt build --select +metric:weekly_active_users # build all resources upstream of weekly_active_users metric -$ dbt ls --select +metric:* --resource-type source # list all source tables upstream of all metrics +dbt build --select "+metric:weekly_active_users" # build all resources upstream of weekly_active_users metric +dbt ls --select "+metric:*" --resource-type source # list all source tables upstream of all metrics ``` ### The "result" method @@ -255,10 +256,10 @@ $ dbt ls --select +metric:* --resource-type source # list all source tables The `result` method is related to the `state` method described above and can be used to select resources based on their result status from a prior run. Note that one of the dbt commands [`run`, `test`, `build`, `seed`] must have been performed in order to create the result on which a result selector operates. You can use `result` selectors in conjunction with the `+` operator. ```bash -$ dbt run --select result:error --state path/to/artifacts # run all models that generated errors on the prior invocation of dbt run -$ dbt test --select result:fail --state path/to/artifacts # run all tests that failed on the prior invocation of dbt test -$ dbt build --select 1+result:fail --state path/to/artifacts # run all the models associated with failed tests from the prior invocation of dbt build -$ dbt seed --select result:error --state path/to/artifacts # run all seeds that generated errors on the prior invocation of dbt seed. +dbt run --select "result:error" --state path/to/artifacts # run all models that generated errors on the prior invocation of dbt run +dbt test --select "result:fail" --state path/to/artifacts # run all tests that failed on the prior invocation of dbt test +dbt build --select "1+result:fail" --state path/to/artifacts # run all the models associated with failed tests from the prior invocation of dbt build +dbt seed --select "result:error" --state path/to/artifacts # run all seeds that generated errors on the prior invocation of dbt seed. ``` ### The "source_status" method @@ -276,8 +277,8 @@ After issuing one of the above commands, you can reference the source freshness ```bash # You can also set the DBT_ARTIFACT_STATE_PATH environment variable instead of the --state flag. -$ dbt source freshness # must be run again to compare current to previous state -$ dbt build --select source_status:fresher+ --state path/to/prod/artifacts +dbt source freshness # must be run again to compare current to previous state +dbt build --select "source_status:fresher+" --state path/to/prod/artifacts ```
@@ -286,8 +287,8 @@ $ dbt build --select source_status:fresher+ --state path/to/prod/artifacts ```bash # You can also set the DBT_STATE environment variable instead of the --state flag. -$ dbt source freshness # must be run again to compare current to previous state -$ dbt build --select source_status:fresher+ --state path/to/prod/artifacts +dbt source freshness # must be run again to compare current to previous state +dbt build --select "source_status:fresher+" --state path/to/prod/artifacts ```
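For readers following the `source_status` examples above, here is a minimal sketch of the same two-step workflow using the environment variable mentioned in the code comments instead of the `--state` flag (the variable name differs by version: `DBT_ARTIFACT_STATE_PATH` on older versions, `DBT_STATE` on newer ones; the artifacts path is only an example):

```bash
# Sketch: point dbt at the production artifacts via the environment variable
# instead of passing --state on each command (path is an example).
export DBT_STATE=path/to/prod/artifacts        # or DBT_ARTIFACT_STATE_PATH on older versions
dbt source freshness                           # run again to capture the current freshness state
dbt build --select "source_status:fresher+"    # build everything downstream of fresher sources
```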
@@ -305,9 +306,9 @@ Supported in v1.5 or newer. The `group` method is used to select models defined within a [group](/reference/resource-configs/group). - ```bash - dbt run --select group:finance # run all models that belong to the finance group. - ``` +```bash +dbt run --select "group:finance" # run all models that belong to the finance group. +```
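For context on the `group:finance` selector above, a minimal sketch of how such a group might be defined and attached to a model in YAML (the group name, owner details, and model name are hypothetical):

```yml
# groups.yml — a sketch of a group definition (hypothetical owner details)
groups:
  - name: finance
    owner:
      name: Finance Analytics
      email: finance-analytics@example.com

# models/_models.yml — assign a model to the group so "group:finance" selects it
models:
  - name: fct_revenue
    config:
      group: finance
```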
@@ -321,12 +322,12 @@ Supported in v1.5 or newer. -The `access` method selects models based on their [access](/reference/resource-properties/access) property. +The `access` method selects models based on their [access](/reference/resource-configs/access) property. ```bash -dbt list --select access:public # list all public models -dbt list --select access:private # list all private models -dbt list --select access:protected # list all protected models +dbt list --select "access:public" # list all public models +dbt list --select "access:private" # list all private models +dbt list --select "access:protected" # list all protected models ``` @@ -344,11 +345,26 @@ Supported in v1.5 or newer. The `version` method selects [versioned models](/docs/collaborate/govern/model-versions) based on their [version identifier](/reference/resource-properties/versions) and [latest version](/reference/resource-properties/latest_version). ```bash -dbt list --select version:latest # only 'latest' versions -dbt list --select version:prerelease # versions newer than the 'latest' version +dbt list --select "version:latest" # only 'latest' versions +dbt list --select "version:prerelease" # versions newer than the 'latest' version dbt list --select version:old # versions older than the 'latest' version -dbt list --select version:none # models that are *not* versioned +dbt list --select "version:none" # models that are *not* versioned ``` + +### The "semantic_model" method + +Supported in v1.6 or newer. + + + +The `semantic_model` method selects [semantic models](/docs/build/semantic-models). + +```bash +dbt list --select "semantic_model:*" # list all semantic models +dbt list --select "+semantic_model:orders" # list your semantic model named "orders" and all upstream resources +``` + + \ No newline at end of file diff --git a/website/docs/reference/node-selection/putting-it-together.md b/website/docs/reference/node-selection/putting-it-together.md index 8faf02e6cc9..48fc5188b32 100644 --- a/website/docs/reference/node-selection/putting-it-together.md +++ b/website/docs/reference/node-selection/putting-it-together.md @@ -4,16 +4,16 @@ title: "Putting it together" ```bash - $ dbt run --select my_package.*+ # select all models in my_package and their children - $ dbt run --select +some_model+ # select some_model and all parents and children +dbt run --select "my_package.*+" # select all models in my_package and their children +dbt run --select "+some_model+" # select some_model and all parents and children - $ dbt run --select tag:nightly+ # select "nightly" models and all children - $ dbt run --select +tag:nightly+ # select "nightly" models and all parents and children +dbt run --select "tag:nightly+" # select "nightly" models and all children +dbt run --select "+tag:nightly+" # select "nightly" models and all parents and children - $ dbt run --select @source:snowplow # build all models that select from snowplow sources, plus their parents +dbt run --select "@source:snowplow" # build all models that select from snowplow sources, plus their parents - $ dbt test --select config.incremental_strategy:insert_overwrite,test_name:unique # execute all `unique` tests that select from models using the `insert_overwrite` incremental strategy - ``` +dbt test --select "config.incremental_strategy:insert_overwrite,test_name:unique" # execute all `unique` tests that select from models using the `insert_overwrite` incremental strategy +``` @@ -22,8 +22,8 @@ and feed exports, while _excluding_ the biggest incremental models (and one othe
```bash - $ dbt run --select @source:snowplow,tag:nightly models/export --exclude package:snowplow,config.materialized:incremental export_performance_timing - ``` +dbt run --select "@source:snowplow,tag:nightly models/export" --exclude "package:snowplow,config.materialized:incremental export_performance_timing" +``` This command selects all models that: diff --git a/website/docs/reference/node-selection/set-operators.md b/website/docs/reference/node-selection/set-operators.md index 7d6b6c2411c..af399b9cad5 100644 --- a/website/docs/reference/node-selection/set-operators.md +++ b/website/docs/reference/node-selection/set-operators.md @@ -11,7 +11,7 @@ Run snowplow_sessions, all ancestors of snowplow_sessions, fct_orders, and all a ```bash - $ dbt run --select +snowplow_sessions +fct_orders +dbt run --select "+snowplow_sessions +fct_orders" ``` ### Intersections @@ -22,15 +22,15 @@ Run all the common ancestors of snowplow_sessions and fct_orders: ```bash - $ dbt run --select +snowplow_sessions,+fct_orders - ``` +dbt run --select "+snowplow_sessions,+fct_orders" +``` Run all the common descendents of stg_invoices and stg_accounts: ```bash - $ dbt run --select stg_invoices+,stg_accounts+ +dbt run --select "stg_invoices+,stg_accounts+" ``` @@ -38,5 +38,5 @@ Run models that are in the marts/finance subdirectory *and* tagged nightly: ```bash - $ dbt run --select marts.finance,tag:nightly - ``` +dbt run --select "marts.finance,tag:nightly" +``` diff --git a/website/docs/reference/node-selection/state-comparison-caveats.md b/website/docs/reference/node-selection/state-comparison-caveats.md index baeeb7e4c75..73947c80a66 100644 --- a/website/docs/reference/node-selection/state-comparison-caveats.md +++ b/website/docs/reference/node-selection/state-comparison-caveats.md @@ -27,8 +27,8 @@ The command `dbt test -s state:modified` will include both: As long as you're adding or changing tests at the same time that you're adding or changing the resources (models, seeds, snapshots) they select from, all should work the way you expect with "simple" state selection: ```shell -$ dbt run -s state:modified -$ dbt test -s state:modified +dbt run -s "state:modified" +dbt test -s "state:modified" ``` This can get complicated, however. If you add a new test without modifying its underlying model, or add a test that selects from a new model and an old unmodified one, you may need to test a model without having first run it. @@ -36,8 +36,8 @@ This can get complicated, however. If you add a new test without modifying its u In v0.18.0, you needed to handle this by building the unmodified models needed for modified tests: ```shell -$ dbt run -s state:modified @state:modified,1+test_type:data -$ dbt test -s state:modified +dbt run -s "state:modified @state:modified,1+test_type:data" +dbt test -s "state:modified" ``` In v0.19.0, dbt added support for deferring upstream references when testing. If a test selects from a model that doesn't exist as a database object in your current environment, dbt will look to the other environment instead—the one defined in your state manifest. This enables you to use "simple" state selection without risk of query failure, but it may have some surprising consequences for tests with multiple parents. For instance, if you have a `relationships` test that depends on one modified model and one unmodified model, the test query will select from data "across" two different environments. 
If you limit or sample your data in development and CI, it may not make much sense to test for referential integrity, knowing there's a good chance of mismatch. @@ -45,8 +45,8 @@ In v0.19.0, dbt added support for deferring upstream references when testing. If If you're a frequent user of `relationships` tests or data tests, or frequently find yourself adding tests without modifying their underlying models, consider tweaking the selection criteria of your CI job. For instance: ```shell -$ dbt run -s state:modified -$ dbt test -s state:modified --exclude test_name:relationships +dbt run -s "state:modified" +dbt test -s "state:modified" --exclude "test_name:relationships" ``` ### False positives @@ -58,7 +58,7 @@ State comparison works by identifying discrepancies between two manifests. Thos dbt will do its best to capture *only* changes that are the result of modifications made in development. In projects with intricate env-aware logic, dbt will err on the side of running too many models (i.e. false positives). Over the next several versions of dbt, we're working on: - iterative improvements to dbt's built-in detective abilities -- better options for more complex projects, in the form of more-specific subselectors (see [this issue](https://github.com/dbt-labs/dbt-core/issues/2704)) +- better options for more complex projects, in the form of more-specific sub-selectors (see [this issue](https://github.com/dbt-labs/dbt-core/issues/2704)) State comparison is now able to detect env-aware config in `dbt_project.yml`. For instance, this target-based config would register as a modification in v0.18.0, but in v0.19.0 it no longer will: diff --git a/website/docs/reference/node-selection/syntax.md b/website/docs/reference/node-selection/syntax.md index 7c165b0f4ff..bb2aeefd742 100644 --- a/website/docs/reference/node-selection/syntax.md +++ b/website/docs/reference/node-selection/syntax.md @@ -14,6 +14,7 @@ dbt's node selection syntax makes it possible to run only specific resources in | [compile](/reference/commands/compile) | `--select`, `--exclude`, `--selector`, `--inline` | | [freshness](/reference/commands/source) | `--select`, `--exclude`, `--selector` | | [build](/reference/commands/build) | `--select`, `--exclude`, `--selector`, `--resource-type`, `--defer` | +| [docs generate](/reference/commands/cmd-docs) | `--select`, `--exclude`, `--selector` | :::info Nodes and resources @@ -24,6 +25,8 @@ We use the terms " By default, `dbt run` executes _all_ of the models in the dependency graph; `dbt seed` creates all seeds, `dbt snapshot` performs every snapshot. The `--select` flag is used to specify a subset of nodes to execute. +To follow [POSIX standards](https://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap12.html) and make things easier to understand, we recommend CLI users use quotes when passing arguments to the `--select` or `--exclude` option (including single or multiple space-delimited, or comma-delimited arguments). Not using quotes might not work reliably on all operating systems, terminals, and user interfaces. For example, `dbt run --select "my_dbt_project_name"` runs all models in your project. + ### How does selection work? 1. dbt gathers all the resources that are matched by one or more of the `--select` criteria, in the order of selection methods (e.g. `tag:`), then graph operators (e.g. 
`+`), then finally set operators ([unions](/reference/node-selection/set-operators#unions), [intersections](/reference/node-selection/set-operators#intersections), [exclusions](/reference/node-selection/exclude)). @@ -51,28 +54,28 @@ Examples: ```bash - $ dbt run --select my_dbt_project_name # runs all models in your project - $ dbt run --select my_dbt_model # runs a specific model - $ dbt run --select path.to.my.models # runs all models in a specific directory - $ dbt run --select my_package.some_model # run a specific model in a specific package - $ dbt run --select tag:nightly # run models with the "nightly" tag - $ dbt run --select path/to/models # run models contained in path/to/models - $ dbt run --select path/to/my_model.sql # run a specific model by its path +dbt run --select "my_dbt_project_name" # runs all models in your project +dbt run --select "my_dbt_model" # runs a specific model +dbt run --select "path.to.my.models" # runs all models in a specific directory +dbt run --select "my_package.some_model" # run a specific model in a specific package +dbt run --select "tag:nightly" # run models with the "nightly" tag +dbt run --select "path/to/models" # run models contained in path/to/models +dbt run --select "path/to/my_model.sql" # run a specific model by its path ``` dbt supports a shorthand language for defining subsets of nodes. This language uses the characters `+`, `@`, `*`, and `,`. ```bash - # multiple arguments can be provided to --select - $ dbt run --select my_first_model my_second_model +# multiple arguments can be provided to --select + dbt run --select "my_first_model my_second_model" - # these arguments can be projects, models, directory paths, tags, or sources - $ dbt run --select tag:nightly my_model finance.base.* +# these arguments can be projects, models, directory paths, tags, or sources +dbt run --select "tag:nightly my_model finance.base.*" - # use methods and intersections for more complex selectors - $ dbt run --select path:marts/finance,tag:nightly,config.materialized:table - ``` +# use methods and intersections for more complex selectors +dbt run --select "path:marts/finance,tag:nightly,config.materialized:table" +``` As your selection logic gets more complex, and becomes unwieldy to type out as command-line arguments, consider using a [yaml selector](/reference/node-selection/yaml-selectors). You can use a predefined definition with the `--selector` flag. @@ -150,7 +153,7 @@ After issuing one of the above commands, you can reference the results by adding ```bash # You can also set the DBT_ARTIFACT_STATE_PATH environment variable instead of the --state flag. -$ dbt run --select result: --defer --state path/to/prod/artifacts +dbt run --select "result:" --defer --state path/to/prod/artifacts ``` The available options depend on the resource (node) type: @@ -169,7 +172,7 @@ The available options depend on the resource (node) type: The state and result selectors can also be combined in a single invocation of dbt to capture errors from a previous run OR any new or modified models.
```bash -$ dbt run --select result:+ state:modified+ --defer --state ./ +dbt run --select "result:+ state:modified+" --defer --state ./ ``` ### Fresh rebuilds @@ -183,7 +186,7 @@ As example: ```bash # Command step order dbt source freshness -dbt build --select source_status:fresher+ +dbt build --select "source_status:fresher+" ``` @@ -202,6 +205,6 @@ After issuing one of the above commands, you can reference the source freshness ```bash # You can also set the DBT_ARTIFACT_STATE_PATH environment variable instead of the --state flag. -$ dbt source freshness # must be run again to compare current to previous state -$ dbt build --select source_status:fresher+ --state path/to/prod/artifacts +dbt source freshness # must be run again to compare current to previous state +dbt build --select "source_status:fresher+" --state path/to/prod/artifacts ``` diff --git a/website/docs/reference/node-selection/test-selection-examples.md b/website/docs/reference/node-selection/test-selection-examples.md index 52439d95d97..feb3898c230 100644 --- a/website/docs/reference/node-selection/test-selection-examples.md +++ b/website/docs/reference/node-selection/test-selection-examples.md @@ -19,14 +19,14 @@ Run generic tests only: ```bash - $ dbt test --select test_type:generic + dbt test --select "test_type:generic" ``` Run singular tests only: ```bash - $ dbt test --select test_type:singular + dbt test --select "test_type:singular" ``` In both cases, `test_type` checks a property of the test itself. These are forms of "direct" test selection. @@ -87,8 +87,8 @@ By default, a test will run when ANY parent is selected; we call this "eager" in In this mode, any test that depends on unbuilt resources will raise an error. ```shell -$ dbt test --select orders -$ dbt build --select orders +dbt test --select "orders" +dbt build --select "orders" ``` @@ -102,8 +102,10 @@ It will only include tests whose references are each within the selected nodes. Put another way, it will prevent tests from running if one or more of its parents is unselected. ```shell - +dbt test --select "orders" --indirect-selection=cautious +dbt build --select "orders" --indirect-selection=cautious + ``` @@ -122,8 +124,8 @@ By default, a test will run when ANY parent is selected; we call this "eager" in In this mode, any test that depends on unbuilt resources will raise an error. ```shell -$ dbt test --select orders -$ dbt build --select orders +dbt test --select "orders" +dbt build --select "orders" ``` @@ -137,8 +139,10 @@ It will only include tests whose references are each within the selected nodes. Put another way, it will prevent tests from running if one or more of its parents is unselected. ```shell -$ dbt test --select orders --indirect-selection=cautious -$ dbt build --select orders --indirect-selection=cautious + +dbt test --select "orders" --indirect-selection=cautious +dbt build --select "orders" --indirect-selection=cautious + ``` @@ -152,8 +156,9 @@ It will only include tests whose references are each within the selected nodes ( This is useful in the same scenarios as "cautious", but also includes when a test depends on a model **and** a direct ancestor of that model (like confirming an aggregation has the same totals as its input).
```shell -$ dbt test --select orders --indirect-selection=buildable -$ dbt build --select orders --indirect-selection=buildable +dbt test --select "orders" --indirect-selection=buildable +dbt build --select "orders" --indirect-selection=buildable + ``` @@ -172,8 +177,8 @@ By default, a test will run when ANY parent is selected; we call this "eager" in In this mode, any test that depends on unbuilt resources will raise an error. ```shell -$ dbt test --select orders -$ dbt build --select orders +dbt test --select "orders" +dbt build --select "orders" ``` @@ -187,8 +192,9 @@ It will only include tests whose references are each within the selected nodes. Put another way, it will prevent tests from running if one or more of its parents is unselected. ```shell -$ dbt test --select orders --indirect-selection=cautious -$ dbt build --select orders --indirect-selection=cautious +dbt test --select "orders" --indirect-selection=cautious +dbt build --select "orders" --indirect-selection=cautious + ``` @@ -202,8 +208,8 @@ It will only include tests whose references are each within the selected nodes ( This is useful in the same scenarios as "cautious", but also includes when a test depends on a model **and** a direct ancestor of that model (like confirming an aggregation has the same totals as its input). ```shell -$ dbt test --select orders --indirect-selection=buildable -$ dbt build --select orders --indirect-selection=buildable +dbt test --select "orders" --indirect-selection=buildable +dbt build --select "orders" --indirect-selection=buildable ``` @@ -213,8 +219,10 @@ $ dbt build --select orders --indirect-selection=buildable This mode will only include tests whose references are each within the selected nodes and will ignore all tests from attached nodes. ```shell -$ dbt test --select orders --indirect-selection=empty -$ dbt build --select orders --indirect-selection=empty + +dbt test --select "orders" --indirect-selection=empty +dbt build --select "orders" --indirect-selection=empty + ``` @@ -234,22 +242,25 @@ The following examples should feel somewhat familiar if you're used to executing ```bash # Run tests on a model (indirect selection) - $ dbt test --select customers + dbt test --select "customers" + + # Run tests on two or more specific models (indirect selection) + dbt test --select "customers orders" # Run tests on all models in the models/staging/jaffle_shop directory (indirect selection) - $ dbt test --select staging.jaffle_shop + dbt test --select "staging.jaffle_shop" # Run tests downstream of a model (note this will select those tests directly!) 
- $ dbt test --select stg_customers+ + dbt test --select "stg_customers+" # Run tests upstream of a model (indirect selection) - $ dbt test --select +stg_customers + dbt test --select "+stg_customers" # Run tests on all models with a particular tag (direct + indirect) - $ dbt test --select tag:my_model_tag + dbt test --select "tag:my_model_tag" # Run tests on all models with a particular materialization (indirect selection) - $ dbt test --select config.materialized:table + dbt test --select "config.materialized:table" ``` @@ -258,16 +269,20 @@ The following examples should feel somewhat familiar if you're used to executing ```bash # tests on all sources - $ dbt test --select source:* + + dbt test --select "source:*" # tests on one source - $ dbt test --select source:jaffle_shop + dbt test --select "source:jaffle_shop" + + # tests on two or more specific sources + dbt test --select "source:jaffle_shop source:raffle_bakery" # tests on one source table - $ dbt test --select source:jaffle_shop.customers + dbt test --select "source:jaffle_shop.customers" # tests on everything _except_ sources - $ dbt test --exclude source:* + dbt test --exclude "source:*" ``` ### More complex selection @@ -276,10 +291,12 @@ Through the combination of direct and indirect selection, there are many ways to ```bash - $ dbt test --select assert_total_payment_amount_is_positive # directly select the test by name - $ dbt test --select payments,test_type:singular # indirect selection, v1.2 - $ dbt test --select payments,test_type:data # indirect selection, v0.18.0 - $ dbt test --select payments --data # indirect selection, earlier versions + + dbt test --select "assert_total_payment_amount_is_positive" # directly select the test by name + dbt test --select "payments,test_type:singular" # indirect selection, v1.2 + dbt test --select "payments,test_type:data" # indirect selection, v0.18.0 + dbt test --select "payments" --data # indirect selection, earlier versions + ``` @@ -288,13 +305,14 @@ Through the combination of direct and indirect selection, there are many ways to ```bash # Run tests on all models with a particular materialization - $ dbt test --select config.materialized:table + dbt test --select "config.materialized:table" # Run tests on all seeds, which use the 'seed' materialization - $ dbt test --select config.materialized:seed + dbt test --select "config.materialized:seed" # Run tests on all snapshots, which use the 'snapshot' materialization - $ dbt test --select config.materialized:snapshot + dbt test --select "config.materialized:snapshot" + ``` Note that this functionality may change in future versions of dbt. @@ -312,8 +330,8 @@ models: - name: orders columns: - name: order_id - tests: tags: [my_column_tag] + tests: - unique ``` @@ -322,7 +340,8 @@ models: ```bash - $ dbt test --select tag:my_column_tag + dbt test --select "tag:my_column_tag" + ``` Currently, tests "inherit" tags applied to columns, sources, and source tables. They do _not_ inherit tags applied to models, seeds, or snapshots. In all likelihood, those tests would still be selected indirectly, because the tag selects its parent. This is a subtle distinction, and it may change in future versions of dbt. 
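To make the inheritance distinction above concrete, a minimal sketch reusing the hypothetical `my_model_tag` and `my_column_tag` names from the earlier examples: the `unique` test inherits the column-level tag but not the model-level one (though `tag:my_model_tag` would still pick the test up indirectly via its parent model).

```yml
models:
  - name: orders
    config:
      tags: [my_model_tag]        # not inherited by the test
    columns:
      - name: order_id
        tags: [my_column_tag]     # inherited by the test, so tag:my_column_tag selects it directly
        tests:
          - unique
```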
@@ -350,5 +369,6 @@ models: ```bash - $ dbt test --select tag:my_test_tag + dbt test --select "tag:my_test_tag" + ``` diff --git a/website/docs/reference/node-selection/yaml-selectors.md b/website/docs/reference/node-selection/yaml-selectors.md index 78342e32779..1e3f8d8d1e2 100644 --- a/website/docs/reference/node-selection/yaml-selectors.md +++ b/website/docs/reference/node-selection/yaml-selectors.md @@ -34,6 +34,7 @@ Each `definition` is comprised of one or more arguments, which can be one of the Use the `union` and `intersection` operator-equivalent keywords to organize multiple arguments. ### CLI-style + ```yml definition: 'tag:nightly' @@ -42,6 +43,7 @@ definition: This simple syntax supports use of the `+`, `@`, and `*` [graph](/reference/node-selection/graph-operators) operators, but it does not support [set](/reference/node-selection/set-operators) operators or `exclude`. ### Key-value + ```yml definition: tag: nightly @@ -317,7 +319,7 @@ selectors: Then in our job definition: ```bash -$ dbt run --selector nightly_diet_snowplow +dbt run --selector nightly_diet_snowplow ``` ## Default @@ -325,6 +327,7 @@ $ dbt run --selector nightly_diet_snowplow Selectors may define a boolean `default` property. If a selector has `default: true`, dbt will use this selector's criteria when tasks do not define their own selection criteria. Let's say we define a default selector that only selects resources defined in our root project: + ```yml selectors: - name: root_project_only @@ -338,16 +341,18 @@ selectors: ``` If I run an "unqualified" command, dbt will use the selection criteria defined in `root_project_only`—that is, dbt will only build / freshness check / generate compiled SQL for resources defined in my root project. + ``` -$ dbt build -$ dbt source freshness -$ dbt docs generate +dbt build +dbt source freshness +dbt docs generate ``` If I run a command that defines its own selection criteria (via `--select`, `--exclude`, or `--selector`), dbt will ignore the default selector and use the flag criteria instead. It will not try to combine the two. -``` -$ dbt run --select model_a -$ dbt run --exclude model_a + +```bash +dbt run --select "model_a" +dbt run --exclude model_a ``` Only one selector may set `default: true` for a given invocation; otherwise, dbt will return an error. You may use a Jinja expression to adjust the value of `default` depending on the environment, however: diff --git a/website/docs/reference/programmatic-invocations.md b/website/docs/reference/programmatic-invocations.md index 6afcd65c1bc..dfd5bae09e6 100644 --- a/website/docs/reference/programmatic-invocations.md +++ b/website/docs/reference/programmatic-invocations.md @@ -2,7 +2,7 @@ title: "Programmatic invocations" --- -In v1.5, dbt-core added support for programmatic invocations. The intent is to expose the existing dbt CLI via a Python entry point, such that top-level commands are callable from within a Python script or application. +In v1.5, dbt-core added support for programmatic invocations. The intent is to expose the existing dbt Core CLI via a Python entry point, such that top-level commands are callable from within a Python script or application. The entry point is a `dbtRunner` class, which allows you to `invoke` the same commands as on the CLI. 
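Since the hunk above only mentions the `dbtRunner` entry point in passing, here is a minimal sketch of a programmatic invocation (assumes dbt-core v1.5 or newer; the selector string is only an example):

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

# create a runner and invoke a command, passing arguments just as on the CLI
dbt = dbtRunner()
res: dbtRunnerResult = dbt.invoke(["run", "--select", "tag:my_favorite_tag"])

# inspect the per-node results of the invocation
if res.success:
    for r in res.result:
        print(f"{r.node.name}: {r.status}")
```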
diff --git a/website/docs/reference/references-overview.md b/website/docs/reference/references-overview.md index 16afd01607c..91a228b6c3e 100644 --- a/website/docs/reference/references-overview.md +++ b/website/docs/reference/references-overview.md @@ -4,6 +4,8 @@ id: "references-overview" sidebar_label: "About References" description: "Connect dbt to any data platform in dbt Cloud or dbt Core, using a dedicated adapter plugin" hide_table_of_contents: true +pagination_next: null +pagination_prev: null --- The References section contains reference materials for developing with dbt, which includes dbt Cloud and dbt Core. @@ -49,9 +51,27 @@ Learn how to add more configurations to your dbt project or adapter, use propert icon="computer"/> + + + + + + diff --git a/website/docs/reference/resource-configs/access.md b/website/docs/reference/resource-configs/access.md new file mode 100644 index 00000000000..da50e48d2f0 --- /dev/null +++ b/website/docs/reference/resource-configs/access.md @@ -0,0 +1,97 @@ +--- +resource_types: [models] +datatype: access +--- + + + +```yml +version: 2 + +models: + - name: model_name + access: private | protected | public +``` + + + + + +Access modifiers may be applied to models one-by-one in YAML properties. In v1.5 and v1.6, you are unable to configure `access` for multiple models at once. Upgrade to v1.7 for additional configuration options. A group or subfolder contains models with varying access levels, so when you designate a model with `access: public`, make sure you intend for this behavior. + + + + + +You can apply access modifiers in config files, including the `dbt_project.yml` file, or to models one-by-one in YAML properties. Applying access configs to a subfolder modifies the default for all models in that subfolder. When setting individual model access, keep in mind that a group or subfolder might contain a variety of access levels, so designating a model with `access: public` should always be an intentional choice. + +There are multiple approaches to configuring access: + +In the model configs of `dbt_project.yml`: + +```yaml +models: + - name: my_public_model + access: public # Older method, still supported + +``` +Or (but not both): + +```yaml +models: + - name: my_public_model + config: + access: public # newly supported in v1.7 + +``` + +In a subfolder: +```yaml +models: + my_project_name: + subfolder_name: + +group: + +access: private # sets default for all models in this subfolder +``` + +In the model.sql file: + +```sql +-- models/my_public_model.sql + +{{ config(access = "public") }} + +select ... +``` + + + +## Definition +The access level of the model you are declaring properties for. + +Some models (not all) are designed to be referenced through the [ref](/reference/dbt-jinja-functions/ref) function across [groups](/docs/build/groups). + +| Access | Referenceable by | +|-----------|-------------------------------| +| private | same group | +| protected | same project/package | +| public | any group, package or project | + +If you try to reference a model outside of its supported access, you will see an error: + +```shell +dbt run -s marketing_model +... +dbt.exceptions.DbtReferenceError: Parsing Error + Node model.jaffle_shop.marketing_model attempted to reference node model.jaffle_shop.finance_model, + which is not allowed because the referenced node is private to the finance group. +``` + +## Default + +By default, all models are "protected." This means that other models in the same project can reference them.
+ +## Related docs + +* [Model Access](/docs/collaborate/govern/model-access#groups) +* [Group configuration](/reference/resource-configs/group) diff --git a/website/docs/reference/resource-configs/bigquery-configs.md b/website/docs/reference/resource-configs/bigquery-configs.md index 89a750f47bd..ffbaa37c059 100644 --- a/website/docs/reference/resource-configs/bigquery-configs.md +++ b/website/docs/reference/resource-configs/bigquery-configs.md @@ -414,7 +414,7 @@ models: columns: - name: field policy_tags: - - 'projects//locations//taxonomies//policyTags/' + - 'projects//locations//taxonomies//policyTags/' ``` diff --git a/website/docs/reference/resource-configs/contract.md b/website/docs/reference/resource-configs/contract.md index e8ea6d82287..59cc511890b 100644 --- a/website/docs/reference/resource-configs/contract.md +++ b/website/docs/reference/resource-configs/contract.md @@ -23,11 +23,34 @@ When the `contract` configuration is enforced, dbt will ensure that your model's This is to ensure that the people querying your model downstream—both inside and outside dbt—have a predictable and consistent set of columns to use in their analyses. Even a subtle change in data type, such as from `boolean` (`true`/`false`) to `integer` (`0`/`1`), could cause queries to fail in surprising ways. + + The `data_type` defined in your YAML file must match a data type your data platform recognizes. dbt does not do any type aliasing itself. If your data platform recognizes both `int` and `integer` as corresponding to the same type, then they will return a match. -When dbt is comparing data types, it will not compare granular details such as size, precision, or scale. We don't think you should sweat the difference between `varchar(256)` and `varchar(257)`, because it doesn't really affect the experience of downstream queriers. If you need a more-precise assertion, it's always possible to accomplish by [writing or using a custom test](/guides/best-practices/writing-custom-generic-tests). + + + + +dbt uses built-in type aliasing for the `data_type` defined in your YAML. For example, you can specify `string` in your contract, and on Postgres/Redshift, dbt will convert it to `text`. If dbt doesn't recognize the `data_type` name among its known aliases, it will pass it through as-is. This is enabled by default, but you can opt-out by setting `alias_types` to `false`. + +Example for disabling: + +```yml + +models: + - name: my_model + config: + contract: + enforced: true + alias_types: false # true by default + +``` + + + +When dbt compares data types, it will not compare granular details such as size, precision, or scale. We don't think you should sweat the difference between `varchar(256)` and `varchar(257)`, because it doesn't really affect the experience of downstream queriers. You can accomplish a more-precise assertion by [writing or using a custom test](/guides/best-practices/writing-custom-generic-tests). -That said, on certain data platforms, you will need to specify a varchar size or numeric scale if you do not want it to revert to the default. This is most relevant for the `numeric` type on Snowflake, which defaults to a precision of 38 and a scale of 0 (zero digits after the decimal, such as rounded to an integer). To avoid this implicit coercion, specify your `data_type` with a nonzero scale, like `numeric(38, 6)`. +Note that you need to specify a varchar size or numeric scale, otherwise dbt relies on default values. 
For example, if a `numeric` type defaults to a precision of 38 and a scale of 0, then the numeric column stores 0 digits to the right of the decimal (it only stores whole numbers), which might cause it to fail contract enforcement. To avoid this implicit coercion, specify your `data_type` with a nonzero scale, like `numeric(38, 6)`. dbt Core 1.7 and higher provides a warning if you don't specify precision and scale when providing a numeric data type. ## Example @@ -47,6 +70,8 @@ models: - type: not_null - name: customer_name data_type: string + - name: non_integer + data_type: numeric(38,3) ``` diff --git a/website/docs/reference/resource-configs/delimiter.md b/website/docs/reference/resource-configs/delimiter.md new file mode 100644 index 00000000000..58d6ba8344a --- /dev/null +++ b/website/docs/reference/resource-configs/delimiter.md @@ -0,0 +1,126 @@ +--- +resource_types: [seeds] +datatype: +default_value: "," +--- + +## Definition + +You can use this optional seed configuration to customize how you separate values in a [seed](/docs/build/seeds) by providing the one-character string. + +* The delimiter defaults to a comma when not specified. +* Explicitly set the `delimiter` configuration value if you want seed files to use a different delimiter, such as "|" or ";". + +:::info New in 1.7! + +Delimiter is new functionality available beginning with dbt Core v1.7. + +::: + + +## Usage + +Specify a delimiter in your `dbt_project.yml` file to customize the global separator for all seed values: + + + +```yml +seeds: + : + +delimiter: "|" # default project delimiter for seeds will be "|" + : + +delimiter: "," # delimiter for seeds in seed_subdirectory will be "," +``` + + + + +Or use a custom delimiter to override the values for a specific seed: + + + +```yml +version: 2 + +seeds: + - name: + config: + delimiter: "|" +``` + + + +## Examples +For a project with: + +* `name: jaffle_shop` in the `dbt_project.yml` file +* `seed-paths: ["seeds"]` in the `dbt_project.yml` file + +### Use a custom delimiter to override global values + +You can set a default behavior for all seeds with an exception for one seed, `seed_a`, which uses a comma: + + + +```yml +seeds: + jaffle_shop: + +delimiter: "|" # default delimiter for seeds in jaffle_shop project will be "|" + seed_a: + +delimiter: "," # delimiter for seed_a will be "," +``` + + + +Your corresponding seed files would be formatted like this: + + + +```text +col_a|col_b|col_c +1|2|3 +4|5|6 +... +``` + + + + + +```text +name,id +luna,1 +doug,2 +... +``` + + + +Or you can configure custom behavior for one seed. The `country_codes` uses the ";" delimiter: + + + +```yml +version: 2 + +seeds: + - name: country_codes + config: + delimiter: ";" +``` + + + +The `country_codes` seed file would be formatted like this: + + + +```text +country_code;country_name +US;United States +CA;Canada +GB;United Kingdom +... 
+``` + + diff --git a/website/docs/reference/resource-configs/enabled.md b/website/docs/reference/resource-configs/enabled.md index b6d0961ee60..52045503088 100644 --- a/website/docs/reference/resource-configs/enabled.md +++ b/website/docs/reference/resource-configs/enabled.md @@ -15,6 +15,7 @@ default_value: true { label: 'Sources', value: 'sources', }, { label: 'Metrics', value: 'metrics', }, { label: 'Exposures', value: 'exposures', }, + { label: 'Semantic models', value: 'semantic models', }, ] }> @@ -250,10 +251,39 @@ exposures: + + + + +Support for disabling semantic models has been added in dbt Core v1.7 + + + + + + + +```yml +semantic_models: + - name: semantic_people + model: ref('people') + config: + enabled: false + +``` + + + +The `enabled` configuration can be nested under the `config` key. + + + + + ## Definition -An optional configuration for disabling models, seeds, snapshots, and tests. +An optional configuration for disabling models, seeds, snapshots, tests, and semantic models. * Default: true diff --git a/website/docs/reference/resource-configs/group.md b/website/docs/reference/resource-configs/group.md index dd73d99edff..7515d8c5789 100644 --- a/website/docs/reference/resource-configs/group.md +++ b/website/docs/reference/resource-configs/group.md @@ -16,6 +16,7 @@ This functionality is new in v1.5. { label: 'Tests', value: 'tests', }, { label: 'Analyses', value: 'analyses', }, { label: 'Metrics', value: 'metrics', }, + { label: 'Semantic models', value: 'semantic models', }, ] }> @@ -265,6 +266,43 @@ metrics: + + + + +Support for grouping semantic models has been added in dbt Core v1.7. + + + + + + + +```yml +semantic_models: + - name: model_name + group: finance + +``` + + + + + +```yml +semantic_models: + [](resource-path): + +group: finance +``` + + + +The `group` configuration can be nested under the `config` key. + + + + + ## Definition diff --git a/website/docs/reference/resource-configs/meta.md b/website/docs/reference/resource-configs/meta.md index d24c5fbaee1..65c8b5f908e 100644 --- a/website/docs/reference/resource-configs/meta.md +++ b/website/docs/reference/resource-configs/meta.md @@ -14,6 +14,8 @@ default_value: {} { label: 'Tests', value: 'tests', }, { label: 'Analyses', value: 'analyses', }, { label: 'Macros', value: 'macros', }, + { label: 'Exposures', value: 'exposures', }, + { label: 'Semantic Models', value: 'semantic models', }, ] }> @@ -172,6 +174,34 @@ exposures: + + + + +Support for the `meta` config on semantic models was added in dbt Core v1.7 + + + + + + + +```yml +semantic_models: + - name: semantic_people + model: ref('people') + config: + meta: {} + +``` +The `meta` configuration can be nested under the `config` key. + + + + + + + ## Definition @@ -248,3 +278,19 @@ select 1 as id ``` + +### Assign owner in the dbt_project.yml as a config property + + + +```yml +models: + jaffle_shop: + materialized: table + config: + meta: + owner: "@alice" +``` + + + diff --git a/website/docs/reference/resource-configs/starrocks-configs.md b/website/docs/reference/resource-configs/starrocks-configs.md new file mode 100644 index 00000000000..093534515c6 --- /dev/null +++ b/website/docs/reference/resource-configs/starrocks-configs.md @@ -0,0 +1,116 @@ +--- +title: "Starrocks configurations" +id: "starrocks-configs" +description: "Starrocks Configurations - Read this in-depth guide to learn about configurations in dbt."
+--- + +## Model Configuration + +A dbt model can be configured using the following syntax: + + + + + + + +```yaml +models: + : + materialized: table # table or view or materialized_view + keys: ['id', 'name', 'some_date'] + table_type: 'PRIMARY' # PRIMARY or DUPLICATE or UNIQUE + distributed_by: ['id'] + buckets: 3 # default 10 + partition_by: ['some_date'] + partition_by_init: ["PARTITION p1 VALUES [('1971-01-01 00:00:00'), ('1991-01-01 00:00:00')),PARTITION p1972 VALUES [('1991-01-01 00:00:00'), ('1999-01-01 00:00:00'))"] + properties: [{"replication_num":"1", "in_memory": "true"}] + refresh_method: 'async' # only for materialized view, default manual +``` + + + + + + + +```yaml +models: + - name: + config: + materialized: table # table or view or materialized_view + keys: ['id', 'name', 'some_date'] + table_type: 'PRIMARY' # PRIMARY or DUPLICATE or UNIQUE + distributed_by: ['id'] + buckets: 3 # default 10 + partition_by: ['some_date'] + partition_by_init: ["PARTITION p1 VALUES [('1971-01-01 00:00:00'), ('1991-01-01 00:00:00')),PARTITION p1972 VALUES [('1991-01-01 00:00:00'), ('1999-01-01 00:00:00'))"] + properties: [{"replication_num":"1", "in_memory": "true"}] + refresh_method: 'async' # only for materialized view, default manual +``` + + + + + + + +```jinja +{{ config( + materialized = 'table', + keys=['id', 'name', 'some_date'], + table_type='PRIMARY', + distributed_by=['id'], + buckets=3, + partition_by=['some_date'], + .... +) }} +``` + + + + +### Configuration Description + +| Option | Description | +|---------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| `materialized` | How the model will be materialized into Starrocks. Supports view, table, incremental, ephemeral, and materialized_view. | +| `keys` | Which columns serve as keys. | +| `table_type` | Table type, supported are PRIMARY or DUPLICATE or UNIQUE. | +| `distributed_by` | Specifies the column of data distribution. If not specified, it defaults to random. | +| `buckets` | The bucket number in one partition. If not specified, it will be automatically inferred. | +| `partition_by` | The partition column list. | +| `partition_by_init` | The partition rule or some real partitions item. | +| `properties` | The table properties configuration of Starrocks. ([Starrocks table properties](https://docs.starrocks.io/en-us/latest/sql-reference/sql-statements/data-definition/CREATE_TABLE#properties)) | +| `refresh_method` | How to refresh materialized views. | + +## Read From Catalog +First, you need to add the catalog to StarRocks. The following is an example for Hive. + +```sql +CREATE EXTERNAL CATALOG `hive_catalog` +PROPERTIES ( + "hive.metastore.uris" = "thrift://127.0.0.1:8087", + "type"="hive" +); +``` +How to add other types of catalogs can be found in the [Catalog Overview](https://docs.starrocks.io/en-us/latest/data_source/catalog/catalog_overview) documentation. Then write the `sources.yaml` file.
+```yaml +sources: + - name: external_example + schema: hive_catalog.hive_db + tables: + - name: hive_table_name +``` +Finally, you can use the `source` macro as shown below: +```jinja +{{ source('external_example', 'hive_table_name') }} +``` \ No newline at end of file diff --git a/website/docs/reference/resource-configs/store_failures.md b/website/docs/reference/resource-configs/store_failures.md index 3c965179211..6c71cdb9296 100644 --- a/website/docs/reference/resource-configs/store_failures.md +++ b/website/docs/reference/resource-configs/store_failures.md @@ -3,7 +3,7 @@ resource_types: [tests] datatype: boolean --- -The configured test(s) will store their failures when `dbt test --store-failures` is invoked. +The configured test(s) will store their failures when `dbt test --store-failures` is invoked. If you set this configuration as `false` but [`store_failures_as`](/reference/resource-configs/store_failures_as) is configured, it will be overridden. ## Description Optionally set a test to always or never store its failures in the database. diff --git a/website/docs/reference/resource-configs/store_failures_as.md b/website/docs/reference/resource-configs/store_failures_as.md new file mode 100644 index 00000000000..a9149360089 --- /dev/null +++ b/website/docs/reference/resource-configs/store_failures_as.md @@ -0,0 +1,76 @@ +--- +resource_types: [tests] +id: "store_failures_as" +--- + +For the `test` resource type, `store_failures_as` is an optional config that specifies how test failures should be stored in the database. If [`store_failures`](/reference/resource-configs/store_failures) is also configured, `store_failures_as` takes precedence. + +The three supported values are: + +- `ephemeral` — nothing stored in the database (default) +- `table` — test failures stored as a database table +- `view` — test failures stored as a database view + +You can configure it in all the same places as `store_failures`, including singular tests (.sql files), generic tests (.yml files), and dbt_project.yml. + +### Examples + +#### Singular test + +[Singular test](https://docs.getdbt.com/docs/build/tests#singular-tests) in `tests/singular/check_something.sql` file + +```sql +{{ config(store_failures_as="table") }} + +-- custom singular test +select 1 as id +where 1=0 +``` + +#### Generic test + +[Generic tests](https://docs.getdbt.com/docs/build/tests#generic-tests) in `models/_models.yml` file + +```yaml +models: + - name: my_model + columns: + - name: id + tests: + - not_null: + config: + store_failures_as: view + - unique: + config: + store_failures_as: ephemeral +``` + +#### Project level + +Config in `dbt_project.yml` + +```yaml +name: "my_project" +version: "1.0.0" +config-version: 2 +profile: "sandcastle" + +tests: + my_project: + +store_failures_as: table + my_subfolder_1: + +store_failures_as: view + my_subfolder_2: + +store_failures_as: ephemeral +``` + +### "Clobbering" configs + +As with most other configurations, `store_failures_as` is "clobbered" when applied hierarchically. Whenever a more specific value is available, it will completely replace the less specific value.
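To illustrate the clobbering behavior described above, a brief hypothetical: if `dbt_project.yml` sets a project-wide default and an individual test sets its own value, the more specific test-level value completely replaces the default for that test (project and model names below are examples):

```yaml
# dbt_project.yml — project-wide default (hypothetical project name)
tests:
  my_project:
    +store_failures_as: view

# models/_models.yml — this test's own config clobbers the default,
# so its failures are stored as a table instead of a view
models:
  - name: my_model
    columns:
      - name: id
        tests:
          - not_null:
              config:
                store_failures_as: table
```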
+ +Additional resources: + +- [Test configurations](/reference/test-configs#related-documentation) +- [Test-specific configurations](/reference/test-configs#test-specific-configurations) +- [Configuring directories of models in dbt_project.yml](/reference/model-configs#configuring-directories-of-models-in-dbt_projectyml) +- [Config inheritance](/reference/configs-and-properties#config-inheritance) \ No newline at end of file diff --git a/website/docs/reference/resource-configs/teradata-configs.md b/website/docs/reference/resource-configs/teradata-configs.md index f0f4f1a6f3e..12a8929429d 100644 --- a/website/docs/reference/resource-configs/teradata-configs.md +++ b/website/docs/reference/resource-configs/teradata-configs.md @@ -35,14 +35,21 @@ id: "teradata-configs" ### * `table_kind` - define the table kind. Legal values are `MULTISET` (default for ANSI transaction mode required by `dbt-teradata`) and `SET`, e.g.: - ```yaml - {{ - config( - materialized="table", - table_kind="SET" - ) - }} - ``` + * in sql materialization definition file: + ```yaml + {{ + config( + materialized="table", + table_kind="SET" + ) + }} + ``` + * in seed configuration: + ```yaml + seeds: + : + table_kind: "SET" + ``` For details, see [CREATE TABLE documentation](https://docs.teradata.com/r/76g1CuvvQlYBjb2WPIuk3g/B6Js16DRQVwPDjgJ8rz7hg). * `table_option` - defines table options. The config supports multiple statements. The definition below uses the Teradata syntax definition to explain what statements are allowed. Square brackets `[]` denote optional parameters. The pipe symbol `|` separates statements. Use commas to combine multiple statements as shown in the examples below: ``` @@ -87,37 +94,51 @@ id: "teradata-configs" ``` Examples: - - :::info Separators between statements - Note the commas that separate statements in `table_option` config. - ::: - - ```yaml - {{ - config( - materialized="table", - table_option="NO FALLBACK" - ) - }} - ``` - ```yaml - {{ - config( - materialized="table", - table_option="NO FALLBACK, NO JOURNAL" - ) - }} - ``` - ```yaml - {{ - config( - materialized="table", - table_option="NO FALLBACK, NO JOURNAL, CHECKSUM = ON, - NO MERGEBLOCKRATIO, - WITH CONCURRENT ISOLATED LOADING FOR ALL" - ) - }} - ``` + * in sql materialization definition file: + ```yaml + {{ + config( + materialized="table", + table_option="NO FALLBACK" + ) + }} + ``` + ```yaml + {{ + config( + materialized="table", + table_option="NO FALLBACK, NO JOURNAL" + ) + }} + ``` + ```yaml + {{ + config( + materialized="table", + table_option="NO FALLBACK, NO JOURNAL, CHECKSUM = ON, + NO MERGEBLOCKRATIO, + WITH CONCURRENT ISOLATED LOADING FOR ALL" + ) + }} + ``` + * in seed configuration: + ```yaml + seeds: + : + table_option: "NO FALLBACK" + ``` + ```yaml + seeds: + : + table_option: "NO FALLBACK, NO JOURNAL" + ``` + ```yaml + seeds: + : + table_option: "NO FALLBACK, NO JOURNAL, CHECKSUM = ON, + NO MERGEBLOCKRATIO, + WITH CONCURRENT ISOLATED LOADING FOR ALL" + ``` For details, see [CREATE TABLE documentation](https://docs.teradata.com/r/76g1CuvvQlYBjb2WPIuk3g/B6Js16DRQVwPDjgJ8rz7hg). @@ -160,46 +181,67 @@ id: "teradata-configs" ``` Examples: - - :::info Separators between statements - Note, unlike with `table_option` statements, there are no commas between statements in `index` config.
- ::: - - ```yaml - {{ - config( - materialized="table", - index="UNIQUE PRIMARY INDEX ( GlobalID )" - ) - }} - ``` - - ```yaml - {{ - config( - materialized="table", - index="PRIMARY INDEX(id) - PARTITION BY RANGE_N(create_date - BETWEEN DATE '2020-01-01' - AND DATE '2021-01-01' - EACH INTERVAL '1' MONTH)" - ) - }} - ``` - - ```yaml - {{ - config( - materialized="table", - index="PRIMARY INDEX(id) - PARTITION BY RANGE_N(create_date - BETWEEN DATE '2020-01-01' - AND DATE '2021-01-01' - EACH INTERVAL '1' MONTH) - INDEX index_attrA (attrA) WITH LOAD IDENTITY" - ) - }} - ``` + * in sql materialization definition file: + ```yaml + {{ + config( + materialized="table", + index="UNIQUE PRIMARY INDEX ( GlobalID )" + ) + }} + ``` + > :information_source: Note, unlike in `table_option`, there are no commas between index statements! + ```yaml + {{ + config( + materialized="table", + index="PRIMARY INDEX(id) + PARTITION BY RANGE_N(create_date + BETWEEN DATE '2020-01-01' + AND DATE '2021-01-01' + EACH INTERVAL '1' MONTH)" + ) + }} + ``` + ```yaml + {{ + config( + materialized="table", + index="PRIMARY INDEX(id) + PARTITION BY RANGE_N(create_date + BETWEEN DATE '2020-01-01' + AND DATE '2021-01-01' + EACH INTERVAL '1' MONTH) + INDEX index_attrA (attrA) WITH LOAD IDENTITY" + ) + }} + ``` + * in seed configuration: + ```yaml + seeds: + : + index: "UNIQUE PRIMARY INDEX ( GlobalID )" + ``` + > :information_source: Note, unlike in `table_option`, there are no commas between index statements! + ```yaml + seeds: + : + index: "PRIMARY INDEX(id) + PARTITION BY RANGE_N(create_date + BETWEEN DATE '2020-01-01' + AND DATE '2021-01-01' + EACH INTERVAL '1' MONTH)" + ``` + ```yaml + seeds: + : + index: "PRIMARY INDEX(id) + PARTITION BY RANGE_N(create_date + BETWEEN DATE '2020-01-01' + AND DATE '2021-01-01' + EACH INTERVAL '1' MONTH) + INDEX index_attrA (attrA) WITH LOAD IDENTITY" + ``` ## Seeds :::info Using seeds to load raw data @@ -220,6 +262,35 @@ Loading CSVs using dbt's seed functionality is not performant for large files. C +use_fastload: true ``` +#### Grants + +Grants are supported in the dbt-teradata adapter with release version 1.2.0 and above. You can use grants to manage access to the datasets you're producing with dbt. To implement these permissions, define grants as resource configs on each model, seed, or snapshot. Define the default grants that apply to the entire project in your `dbt_project.yml`, and define model-specific grants within each model's SQL or YAML file. + +For example, in `models/schema.yml`: + ```yaml + models: + - name: model_name + config: + grants: + select: ['user_a', 'user_b'] + ``` + +Another example, adding multiple grants: + + ```yaml + models: + - name: model_name + config: + materialized: table + grants: + select: ["user_b"] + insert: ["user_c"] + ``` +> :information_source: `copy_grants` is not supported in Teradata. + +More on grants can be found at https://docs.getdbt.com/reference/resource-configs/grants + ## Common Teradata-specific tasks * *collect statistics* - when a table is created or modified significantly, there might be a need to tell Teradata to collect statistics for the optimizer. It can be done using `COLLECT STATISTICS` command.
You can perform this step using dbt's `post-hooks`, e.g.: diff --git a/website/docs/reference/resource-properties/access.md b/website/docs/reference/resource-properties/access.md deleted file mode 100644 index 42b9893ed7f..00000000000 --- a/website/docs/reference/resource-properties/access.md +++ /dev/null @@ -1,53 +0,0 @@ ---- -resource_types: [models] -datatype: access -required: no ---- - -:::info New functionality -This functionality is new in v1.5. -::: - - - -```yml -version: 2 - -models: - - name: model_name - access: private | protected | public -``` - - - -Access modifiers may be applied to models one-by-one in YAML properties. It is not currently possible to configure `access` for multiple models at once. A group or subfolder contains models with a variety of access levels, and designating a model with `access: public` should always be a conscious and intentional choice. - -## Definition -The access level of the model you are declaring properties for. - -Some models (not all) are designed to be referenced through the [ref](/reference/dbt-jinja-functions/ref) function across [groups](/docs/build/groups). - -| Access | Referenceable by | -|-----------|-------------------------------| -| private | same group | -| protected | same project/package | -| public | any group, package or project | - -If you try to reference a model outside of its supported access, you will see an error: - -```shell -dbt run -s marketing_model -... -dbt.exceptions.DbtReferenceError: Parsing Error - Node model.jaffle_shop.marketing_model attempted to reference node model.jaffle_shop.finance_model, - which is not allowed because the referenced node is private to the finance group. -``` - -## Default - -By default, all models are "protected." This means that other models in the same project can reference them. - -## Related docs - -* [Model Access](/docs/collaborate/govern/model-access#groups) -* [Group configuration](/reference/resource-configs/group) diff --git a/website/docs/reference/seed-configs.md b/website/docs/reference/seed-configs.md index d74f414cbfe..429aa9444ae 100644 --- a/website/docs/reference/seed-configs.md +++ b/website/docs/reference/seed-configs.md @@ -23,6 +23,7 @@ seeds: [](/reference/resource-configs/resource-path): [+](/reference/resource-configs/plus-prefix)[quote_columns](/reference/resource-configs/quote_columns): true | false [+](/reference/resource-configs/plus-prefix)[column_types](/reference/resource-configs/column_types): {column_name: datatype} + [+](/reference/resource-configs/plus-prefix)[delimiter](/reference/resource-configs/delimiter): ``` @@ -43,6 +44,7 @@ seeds: config: [quote_columns](/reference/resource-configs/quote_columns): true | false [column_types](/reference/resource-configs/column_types): {column_name: datatype} + [delimiter](/reference/resource-configs/delimiter): ``` diff --git a/website/docs/reference/snowflake-permissions.md b/website/docs/reference/snowflake-permissions.md deleted file mode 100644 index 6a469d12230..00000000000 --- a/website/docs/reference/snowflake-permissions.md +++ /dev/null @@ -1,25 +0,0 @@ ---- -title: "Snowflake Permissions" ---- - -## Example Snowflake permissions - -``` --- NOTE: warehouse_name, database_name, and role_name are placeholders! --- Replace as-needed for your organization's naming convention!
- -grant all on warehouse warehouse_name to role role_name; -grant usage on database database_name to role role_name; -grant create schema on database database_name to role role_name; -grant usage on schema database.an_existing_schema to role role_name; -grant create table on schema database.an_existing_schema to role role_name; -grant create view on schema database.an_existing_schema to role role_name; -grant usage on future schemas in database database_name to role role_name; -grant monitor on future schemas in database database_name to role role_name; -grant select on future tables in database database_name to role role_name; -grant select on future views in database database_name to role role_name; -grant usage on all schemas in database database_name to role role_name; -grant monitor on all schemas in database database_name to role role_name; -grant select on all tables in database database_name to role role_name; -grant select on all views in database database_name to role role_name; -``` diff --git a/website/docs/terms/materialization.md b/website/docs/terms/materialization.md index fdeaaebfcc8..328076f1483 100644 --- a/website/docs/terms/materialization.md +++ b/website/docs/terms/materialization.md @@ -11,7 +11,7 @@ hoverSnippet: The exact Data Definition Language (DDL) that dbt will use when cr :::important This page could use some love -This term would benefit from additional depth and examples. Have knowledge to contribute? [Create a discussion in the docs.getdbt.com GitHub repository](https://github.com/dbt-labs/docs.getdbt.com/discussions) to begin the process of becoming a glossary contributor! +This term would benefit from additional depth and examples. Have knowledge to contribute? [Create an issue in the docs.getdbt.com repository](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) to begin the process of becoming a glossary contributor! ::: The exact Data Definition Language (DDL) that dbt will use when creating the model’s equivalent in a . It's the manner in which the data is represented, and each of those options is defined either canonically (tables, views, incremental), or bespoke. diff --git a/website/docs/terms/model.md b/website/docs/terms/model.md new file mode 100644 index 00000000000..c589cc196a7 --- /dev/null +++ b/website/docs/terms/model.md @@ -0,0 +1,9 @@ +--- +id: model +title: Model +description: A model is an essential building block of the DAG +displayText: model +hoverSnippet: A model is an essential building block of the DAG +--- + +A model is an essential building block of the DAG that lives in a single file and contains logic that transforms data. This logic can be expressed as a SQL `select` statement or a Python dataframe operation. Models can be materialized in the warehouse in different ways — most of these materializations require models to be built in the warehouse. \ No newline at end of file diff --git a/website/docs/terms/table.md b/website/docs/terms/table.md index 69fc2b3e6b6..cbe36ec1315 100644 --- a/website/docs/terms/table.md +++ b/website/docs/terms/table.md @@ -6,7 +6,7 @@ displayText: table hoverSnippet: In simplest terms, a table is the direct storage of data in rows and columns. Think excel sheet with raw values in each of the cells. --- :::important This page could use some love -This term would benefit from additional depth and examples. Have knowledge to contribute? 
[Create a discussion in the docs.getdbt.com GitHub repository](https://github.com/dbt-labs/docs.getdbt.com/discussions) to begin the process of becoming a glossary contributor! +This term would benefit from additional depth and examples. Have knowledge to contribute? [Create an issue in the docs.getdbt.com repository](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) to begin the process of becoming a glossary contributor! ::: In simplest terms, a table is the direct storage of data in rows and columns. Think excel sheet with raw values in each of the cells. diff --git a/website/docs/terms/view.md b/website/docs/terms/view.md index 5d9238256e0..90cd5d1f36f 100644 --- a/website/docs/terms/view.md +++ b/website/docs/terms/view.md @@ -6,7 +6,7 @@ displayText: view hoverSnippet: A view (as opposed to a table) is a defined passthrough SQL query that can be run against a database (or data warehouse). --- :::important This page could use some love -This term would benefit from additional depth and examples. Have knowledge to contribute? [Create a discussion in the docs.getdbt.com GitHub repository](https://github.com/dbt-labs/docs.getdbt.com/discussions) to begin the process of becoming a glossary contributor! +This term would benefit from additional depth and examples. Have knowledge to contribute? [Create an issue in the docs.getdbt.com repository](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) to begin the process of becoming a glossary contributor! ::: A view (as opposed to a ) is a defined passthrough SQL query that can be run against a database (or ). A view doesn’t store data, like a table does, but it defines the logic that you need to fetch the underlying data. diff --git a/website/docusaurus.config.js b/website/docusaurus.config.js index 0eae62ecec3..ce81d614c65 100644 --- a/website/docusaurus.config.js +++ b/website/docusaurus.config.js @@ -71,13 +71,13 @@ var siteSettings = { announcementBar: { id: "biweekly-demos", content: - "Register now for Coalesce 2023. 
The Analytics Engineering Conference!", - backgroundColor: "#7444FD", + "Join our weekly demos and dbt Cloud in action!", + backgroundColor: "#047377", textColor: "#fff", isCloseable: true, }, announcementBarActive: true, - announcementBarLink: "https://coalesce.getdbt.com/", + announcementBarLink: "https://www.getdbt.com/resources/dbt-cloud-demos-with-experts?utm_source=docs&utm_medium=event&utm_campaign=q1-2024_cloud-demos-with-experts_awareness", // Set community spotlight member on homepage // This is the ID for a specific file under docs/community/spotlight communitySpotlightMember: "faith-lierheimer", diff --git a/website/sidebars.js b/website/sidebars.js index 8b162f67af3..055dfed0e7d 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -7,6 +7,7 @@ const sidebarSettings = { collapsed: true, link: { type: "doc", id: "docs/supported-data-platforms" }, items: [ + "docs/supported-data-platforms", "docs/connect-adapters", "docs/verified-adapters", "docs/trusted-adapters", @@ -17,12 +18,12 @@ const sidebarSettings = { { type: "category", label: "About dbt Cloud", + link: { type: "doc", id: "docs/cloud/about-cloud/dbt-cloud-features" }, items: [ "docs/cloud/about-cloud/dbt-cloud-features", "docs/cloud/about-cloud/architecture", "docs/cloud/about-cloud/tenancy", "docs/cloud/about-cloud/regions-ip-addresses", - "docs/cloud/about-cloud/about-cloud-ide", "docs/cloud/about-cloud/browsers", ], }, // About dbt Cloud directory @@ -35,6 +36,7 @@ const sidebarSettings = { type: "category", label: "Set up dbt", collapsed: true, + link: { type: "doc", id: "docs/about-setup" }, items: [ "docs/about-setup", "docs/environments-in-dbt", @@ -42,12 +44,14 @@ const sidebarSettings = { type: "category", label: "dbt Cloud", collapsed: true, + link: { type: "doc", id: "docs/cloud/about-cloud-setup" }, items: [ "docs/cloud/about-cloud-setup", "docs/dbt-cloud-environments", { type: "category", label: "Connect data platform", + link: { type: "doc", id: "docs/cloud/connect-data-platform/about-connections" }, items: [ "docs/cloud/connect-data-platform/about-connections", "docs/cloud/connect-data-platform/connect-starburst-trino", @@ -61,13 +65,15 @@ const sidebarSettings = { { type: "category", label: "Manage access", + link: { type: "doc", id: "docs/cloud/manage-access/about-user-access" }, items: [ "docs/cloud/manage-access/about-user-access", - "docs/cloud/manage-access/seats-and-users", { type: "category", - label: "Permissions", + label: "User permissions and licenses", + link: { type: "doc", id: "docs/cloud/manage-access/seats-and-users" }, items: [ + "docs/cloud/manage-access/seats-and-users", "docs/cloud/manage-access/self-service-permissions", "docs/cloud/manage-access/enterprise-permissions", ], @@ -75,7 +81,8 @@ const sidebarSettings = { { type: "category", - label: "Single sign-on", + label: "Single sign-on and Oauth", + link: { type: "doc", id: "docs/cloud/manage-access/sso-overview" }, items: [ "docs/cloud/manage-access/sso-overview", "docs/cloud/manage-access/auth0-migration", @@ -83,16 +90,11 @@ const sidebarSettings = { "docs/cloud/manage-access/set-up-sso-okta", "docs/cloud/manage-access/set-up-sso-google-workspace", "docs/cloud/manage-access/set-up-sso-azure-active-directory", - ], - }, // SSO - { - type: "category", - label: "OAuth with data platforms", - items: [ "docs/cloud/manage-access/set-up-snowflake-oauth", + "docs/cloud/manage-access/set-up-databricks-oauth", "docs/cloud/manage-access/set-up-bigquery-oauth", ], - }, // oauth + }, // SSO "docs/cloud/manage-access/audit-log", ], 
}, // Manage access @@ -100,38 +102,60 @@ const sidebarSettings = { { type: "category", label: "Configure Git", + link: { type: "doc", id: "docs/cloud/git/git-configuration-in-dbt-cloud" }, items: [ + "docs/cloud/git/git-configuration-in-dbt-cloud", + "docs/cloud/git/import-a-project-by-git-url", "docs/cloud/git/connect-github", "docs/cloud/git/connect-gitlab", { type: "category", label: "Azure DevOps", + link: { type: "doc", id: "docs/cloud/git/connect-azure-devops" }, items: [ "docs/cloud/git/connect-azure-devops", "docs/cloud/git/setup-azure", "docs/cloud/git/authenticate-azure", ], }, - "docs/cloud/git/import-a-project-by-git-url", ], }, // Supported Git providers { type: "category", - label: "Develop in the IDE", - link: { - type: "doc", - id: "docs/cloud/dbt-cloud-ide/develop-in-the-cloud", - }, + label: "Develop in dbt Cloud", + link: { type: "doc", id: "docs/cloud/about-cloud-develop" }, items: [ - "docs/cloud/dbt-cloud-ide/ide-user-interface", - "docs/cloud/dbt-cloud-ide/lint-format", - "docs/cloud/dbt-cloud-ide/dbt-cloud-tips", + "docs/cloud/about-cloud-develop", + "docs/cloud/about-cloud-develop-defer", + { + type: "category", + label: "dbt Cloud CLI", + link: { type: "doc", id: "docs/cloud/cloud-cli-installation" }, + items: [ + "docs/cloud/cloud-cli-installation", + "docs/cloud/configure-cloud-cli", + ], + }, + { + type: "category", + label: "dbt Cloud IDE", + link: { type: "doc", id: "docs/cloud/dbt-cloud-ide/develop-in-the-cloud" }, + items: [ + "docs/cloud/dbt-cloud-ide/develop-in-the-cloud", + "docs/cloud/dbt-cloud-ide/ide-user-interface", + "docs/cloud/dbt-cloud-ide/lint-format", + "docs/cloud/dbt-cloud-ide/dbt-cloud-tips", + ], + }, ], - }, // dbt Cloud IDE directory + }, // dbt Cloud develop directory { type: "category", label: "Secure your tenant", + link: { type: "doc", id: "docs/cloud/secure/secure-your-tenant" }, items: [ + "docs/cloud/secure/secure-your-tenant", + "docs/cloud/secure/ip-restrictions", "docs/cloud/secure/about-privatelink", "docs/cloud/secure/snowflake-privatelink", "docs/cloud/secure/databricks-privatelink", @@ -149,13 +173,15 @@ const sidebarSettings = { collapsed: true, link: { type: "doc", id: "docs/core/about-core-setup" }, items: [ - "docs/core/about-the-cli", + "docs/core/about-core-setup", + "docs/core/about-dbt-core", "docs/core/dbt-core-environments", { type: "category", label: "Install dbt", link: { type: "doc", id: "docs/core/installation" }, items: [ + "docs/core/installation", "docs/core/homebrew-install", "docs/core/pip-install", "docs/core/docker-install", @@ -170,6 +196,7 @@ const sidebarSettings = { id: "docs/core/connect-data-platform/about-core-connections", }, items: [ + "docs/core/connect-data-platform/about-core-connections", "docs/core/connect-data-platform/profiles.yml", "docs/core/connect-data-platform/connection-profiles", "docs/core/connect-data-platform/bigquery-setup", @@ -212,6 +239,7 @@ const sidebarSettings = { "docs/core/connect-data-platform/fal-setup", "docs/core/connect-data-platform/decodable-setup", "docs/core/connect-data-platform/upsolver-setup", + "docs/core/connect-data-platform/starrocks-setup", ], }, ], @@ -224,16 +252,19 @@ const sidebarSettings = { type: "category", label: "Build dbt projects", collapsed: true, + link: { type: "doc", id: "docs/build/projects" }, items: [ "docs/build/projects", { type: "category", label: "Build your DAG", collapsed: true, + link: { type: "doc", id: "docs/build/models" }, items: [ { type: "category", label: "Models", + link: { type: "doc", id: "docs/build/models" }, items: 
[ "docs/build/models", "docs/build/sql-models", @@ -257,38 +288,43 @@ const sidebarSettings = { link: { type: "doc", id: "docs/build/build-metrics-intro" }, collapsed: true, items: [ + "docs/build/build-metrics-intro", "docs/build/sl-getting-started", { type: "category", label: "About MetricFlow", link: { type: "doc", id: "docs/build/about-metricflow" }, items: [ + "docs/build/about-metricflow", "docs/build/join-logic", "docs/build/validation", + "docs/build/saved-queries", "docs/build/metricflow-time-spine", - "docs/build/metricflow-cli", - ] + "docs/build/metricflow-commands", + ], }, { type: "category", label: "Semantic models", link: { type: "doc", id: "docs/build/semantic-models" }, items: [ + "docs/build/semantic-models", "docs/build/dimensions", "docs/build/entities", - "docs/build/measures" - ] + "docs/build/measures", + ], }, { type: "category", label: "Metrics", link: { type: "doc", id: "docs/build/metrics-overview" }, items: [ + "docs/build/metrics-overview", "docs/build/cumulative", "docs/build/derived", "docs/build/ratio", "docs/build/simple", - ] + ], }, ], }, @@ -296,7 +332,9 @@ const sidebarSettings = { type: "category", label: "Enhance your models", collapsed: true, + link: { type: "doc", id: "docs/build/enhance-your-models" }, items: [ + "docs/build/enhance-your-models", "docs/build/materializations", "docs/build/incremental-models", ], @@ -305,7 +343,9 @@ const sidebarSettings = { type: "category", label: "Enhance your code", collapsed: true, + link: { type: "doc", id: "docs/build/enhance-your-code" }, items: [ + "docs/build/enhance-your-code", "docs/build/project-variables", "docs/build/environment-variables", "docs/build/packages", @@ -316,7 +356,9 @@ const sidebarSettings = { type: "category", label: "Organize your outputs", collapsed: true, + link: { type: "doc", id: "docs/build/organize-your-outputs" }, items: [ + "docs/build/organize-your-outputs", "docs/build/custom-schemas", "docs/build/custom-databases", "docs/build/custom-aliases", @@ -333,6 +375,7 @@ const sidebarSettings = { collapsed: true, link: { type: "doc", id: "docs/deploy/deployments" }, items: [ + "docs/deploy/deployments", "docs/deploy/job-scheduler", "docs/deploy/deploy-environments", "docs/deploy/continuous-integration", @@ -341,6 +384,7 @@ const sidebarSettings = { label: "Jobs", link: { type: "doc", id: "docs/deploy/jobs" }, items: [ + "docs/deploy/jobs", "docs/deploy/deploy-jobs", "docs/deploy/ci-jobs", "docs/deploy/job-commands", @@ -351,7 +395,9 @@ const sidebarSettings = { label: "Monitor jobs and alerts", link: { type: "doc", id: "docs/deploy/monitor-jobs" }, items: [ + "docs/deploy/monitor-jobs", "docs/deploy/run-visibility", + "docs/deploy/retry-jobs", "docs/deploy/job-notifications", "docs/deploy/webhooks", "docs/deploy/artifacts", @@ -365,11 +411,14 @@ const sidebarSettings = { { type: "category", label: "Collaborate with others", + link: { type: "doc", id: "docs/collaborate/collaborate-with-others" }, items: [ + "docs/collaborate/collaborate-with-others", "docs/collaborate/explore-projects", { type: "category", label: "Git version control", + link: { type: "doc", id: "docs/collaborate/git-version-control" }, items: [ "docs/collaborate/git-version-control", "docs/collaborate/git/version-control-basics", @@ -381,6 +430,7 @@ const sidebarSettings = { { type: "category", label: "Document your dbt projects", + link: { type: "doc", id: "docs/collaborate/documentation" }, items: [ "docs/collaborate/documentation", "docs/collaborate/build-and-view-your-docs", @@ -395,6 +445,7 @@ const 
sidebarSettings = { id: "docs/collaborate/govern/about-model-governance", }, items: [ + "docs/collaborate/govern/about-model-governance", "docs/collaborate/govern/model-access", "docs/collaborate/govern/model-contracts", "docs/collaborate/govern/model-versions", @@ -406,24 +457,38 @@ const sidebarSettings = { { type: "category", label: "Use the dbt Semantic Layer", + collapsed: true, link: { type: "doc", id: "docs/use-dbt-semantic-layer/dbt-sl" }, items: [ + "docs/use-dbt-semantic-layer/dbt-sl", "docs/use-dbt-semantic-layer/quickstart-sl", "docs/use-dbt-semantic-layer/setup-sl", - "docs/use-dbt-semantic-layer/avail-sl-integrations", "docs/use-dbt-semantic-layer/sl-architecture", + { + type: "category", + label: "Integrations", + link: { type: "doc", id: "docs/use-dbt-semantic-layer/avail-sl-integrations" }, + items: [ + "docs/use-dbt-semantic-layer/avail-sl-integrations", + "docs/use-dbt-semantic-layer/gsheets", + "docs/use-dbt-semantic-layer/tableau", + ], + }, ], }, { type: "category", label: "dbt Cloud APIs", collapsed: true, + link: { type: "doc", id: "docs/dbt-cloud-apis/overview" }, items: [ "docs/dbt-cloud-apis/overview", { type: "category", label: "Authentication", + link: { type: "doc", id: "docs/dbt-cloud-apis/authentication" }, items: [ + "docs/dbt-cloud-apis/authentication", "docs/dbt-cloud-apis/user-tokens", "docs/dbt-cloud-apis/service-tokens", ], @@ -433,6 +498,7 @@ const sidebarSettings = { label: "Administrative API", link: { type: "doc", id: "docs/dbt-cloud-apis/admin-cloud-api" }, items: [ + "docs/dbt-cloud-apis/admin-cloud-api", { type: "link", label: "API v2 (legacy docs)", @@ -455,18 +521,25 @@ const sidebarSettings = { label: "Discovery API", link: { type: "doc", id: "docs/dbt-cloud-apis/discovery-api" }, items: [ + "docs/dbt-cloud-apis/discovery-api", "docs/dbt-cloud-apis/discovery-use-cases-and-examples", "docs/dbt-cloud-apis/project-state", "docs/dbt-cloud-apis/discovery-querying", { type: "category", label: "Schema", + link: { type: "doc", id: "docs/dbt-cloud-apis/discovery-schema-environment" }, items: [ + "docs/dbt-cloud-apis/discovery-schema-environment", { type: "category", label: "Job", - link: { type: "doc", id: "docs/dbt-cloud-apis/discovery-schema-job" }, + link: { + type: "doc", + id: "docs/dbt-cloud-apis/discovery-schema-job", + }, items: [ + "docs/dbt-cloud-apis/discovery-schema-job", "docs/dbt-cloud-apis/discovery-schema-job-model", "docs/dbt-cloud-apis/discovery-schema-job-models", "docs/dbt-cloud-apis/discovery-schema-job-metric", @@ -486,11 +559,6 @@ const sidebarSettings = { ], }, { - type: "category", - label: "Environment", - link: { type: "doc", id: "docs/dbt-cloud-apis/discovery-schema-environment" }, - items: [ - { type: "category", label: "Applied", items: [ @@ -504,9 +572,7 @@ const sidebarSettings = { // items: [ // // insert pages here // ], - // }, - ], - }, + // }, ], }, ], @@ -516,6 +582,7 @@ const sidebarSettings = { label: "Semantic Layer APIs", link: { type: "doc", id: "docs/dbt-cloud-apis/sl-api-overview" }, items: [ + "docs/dbt-cloud-apis/sl-api-overview", "docs/dbt-cloud-apis/sl-jdbc", "docs/dbt-cloud-apis/sl-graphql", "docs/dbt-cloud-apis/sl-manifest", @@ -526,14 +593,33 @@ const sidebarSettings = { { type: "category", label: "Available dbt versions", + link: { type: "doc", id: "docs/dbt-versions/core" }, items: [ "docs/dbt-versions/core", "docs/dbt-versions/upgrade-core-in-cloud", "docs/dbt-versions/product-lifecycles", "docs/dbt-versions/experimental-features", + { + type: "category", + label: "dbt Core upgrade guides", + 
link: { + type: "generated-index", + title: "Version upgrade guides", + description: + "Learn what's new in the latest version of dbt Core.", + slug: "/docs/dbt-versions/core-upgrade", + }, + items: [ + { + type: "autogenerated", + dirName: "docs/dbt-versions/core-upgrade", + }, + ], + }, { type: "category", label: "dbt Cloud Release Notes", + link: { type: "doc", id: "docs/dbt-versions/dbt-cloud-release-notes" }, items: [ "docs/dbt-versions/dbt-cloud-release-notes", { @@ -622,6 +708,7 @@ const sidebarSettings = { "reference/resource-configs/fal-configs", "reference/resource-configs/oracle-configs", "reference/resource-configs/upsolver-configs", + "reference/resource-configs/starrocks-configs", ], }, { @@ -633,7 +720,6 @@ const sidebarSettings = { type: "category", label: "General properties", items: [ - "reference/resource-properties/access", "reference/resource-properties/columns", "reference/resource-properties/config", "reference/resource-properties/constraints", @@ -650,6 +736,7 @@ const sidebarSettings = { type: "category", label: "General configs", items: [ + "reference/resource-configs/access", "reference/resource-configs/alias", "reference/resource-configs/database", "reference/resource-configs/enabled", @@ -684,6 +771,7 @@ const sidebarSettings = { "reference/seed-properties", "reference/seed-configs", "reference/resource-configs/column_types", + "reference/resource-configs/delimiter", "reference/resource-configs/quote_columns", ], }, @@ -711,6 +799,7 @@ const sidebarSettings = { "reference/resource-configs/limit", "reference/resource-configs/severity", "reference/resource-configs/store_failures", + "reference/resource-configs/store_failures_as", "reference/resource-configs/where", ], }, @@ -864,7 +953,13 @@ const sidebarSettings = { { type: "category", label: "Database Permissions", - items: ["reference/snowflake-permissions"], + items: [ + "reference/database-permissions/about-database-permissions", + "reference/database-permissions/databricks-permissions", + "reference/database-permissions/postgres-permissions", + "reference/database-permissions/redshift-permissions", + "reference/database-permissions/snowflake-permissions", + ], }, ], guides: [ @@ -928,7 +1023,19 @@ const sidebarSettings = { }, { type: "category", - label: "Materializations best practices", + label: "How we build our dbt Mesh projects", + link: { + type: "doc", + id: "guides/best-practices/how-we-mesh/mesh-1-intro", + }, + items: [ + "guides/best-practices/how-we-mesh/mesh-2-structures", + "guides/best-practices/how-we-mesh/mesh-3-implementation", + ], + }, + { + type: "category", + label: "Materialization best practices", link: { type: "doc", id: "guides/best-practices/materializations/materializations-guide-1-guide-overview", @@ -1025,20 +1132,17 @@ const sidebarSettings = { { type: "category", label: "Versions", - link: { - type: "generated-index", - title: "Version migration guides", - description: - "Learn how to upgrade to the latest version of dbt Core.", - slug: "/guides/migration/versions", - }, items: [ - { - type: "autogenerated", - dirName: "guides/migration/versions", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.7", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.6", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.5", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.4", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.3", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.2", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.1", + "docs/dbt-versions/core-upgrade/upgrading-to-v1.0", + ], 
}, - ], - }, { type: "category", label: "Tools", diff --git a/website/snippets/_adapters-trusted.md b/website/snippets/_adapters-trusted.md index 10af0218e22..7747ce16dec 100644 --- a/website/snippets/_adapters-trusted.md +++ b/website/snippets/_adapters-trusted.md @@ -2,7 +2,19 @@ + + + + diff --git a/website/snippets/_adapters-verified.md b/website/snippets/_adapters-verified.md index 7caf099b7d1..3cc1e800448 100644 --- a/website/snippets/_adapters-verified.md +++ b/website/snippets/_adapters-verified.md @@ -2,61 +2,60 @@ -* Install these adapters using the CLI as they're not currently supported in dbt Cloud.
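To make the adapter note above concrete, the sketch below shows installing an adapter from the command line with pip. The `dbt-postgres` package is only an illustration and is not named in this changeset; substitute the adapter package for your platform.

```shell
# Illustration only: install dbt Core and one adapter from PyPI.
# Swap dbt-postgres for the adapter package your platform needs.
python -m pip install dbt-core dbt-postgres

# Confirm dbt can see the installed adapter plugin.
dbt --version
```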
diff --git a/website/snippets/_cloud-cli-flag.md b/website/snippets/_cloud-cli-flag.md new file mode 100644 index 00000000000..523591a438c --- /dev/null +++ b/website/snippets/_cloud-cli-flag.md @@ -0,0 +1,5 @@ +:::info Public preview functionality + +The dbt Cloud CLI is currently in [public preview](/docs/dbt-versions/product-lifecycles#dbt-cloud). Share feedback or request features you'd like to see on the [dbt community Slack](https://getdbt.slack.com/archives/C05M77P54FL). + +::: diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md index 5388379dc34..2488e1d6c17 100644 --- a/website/snippets/_cloud-environments-info.md +++ b/website/snippets/_cloud-environments-info.md @@ -3,17 +3,17 @@ In dbt Cloud, there are two types of environments: - Deployment environment — Determines the settings used when jobs created within that environment are executed. -- Development environment — Determines the settings used in the dbt Cloud IDE for that particular dbt Cloud project. +- Development environment — Determines the settings used in the dbt Cloud IDE or dbt Cloud CLI, for that particular project. Each dbt Cloud project can only have a single development environment but can have any number of deployment environments. | | Development Environments | Deployment Environments | | --- | --- | --- | -| Determines settings for | dbt Cloud IDE | dbt Cloud Job runs | +| Determines settings for | dbt Cloud IDE or dbt Cloud CLI | dbt Cloud Job runs | | How many can I have in my project? | 1 | Any number | :::note -For users familiar with development on the CLI, each environment is roughly analogous to an entry in your `profiles.yml` file, with some additional information about your repository to ensure the proper version of code is executed. More info on dbt core environments [here](/docs/core/dbt-core-environments). +For users familiar with development on dbt Core, each environment is roughly analogous to an entry in your `profiles.yml` file, with some additional information about your repository to ensure the proper version of code is executed. More info on dbt core environments [here](/docs/core/dbt-core-environments). ::: ## Common environment settings @@ -38,7 +38,7 @@ Both development and deployment environments have a section called **General Set By default, all environments will use the default branch in your repository (usually the `main` branch) when accessing your dbt code. This is overridable within each dbt Cloud Environment using the **Default to a custom branch** option. This setting have will have slightly different behavior depending on the environment type: -- **Development**: determines which branch in the dbt Cloud IDE developers create branches from and open PRs against +- **Development**: determines which branch in the dbt Cloud IDE or dbt Cloud CLI developers create branches from and open PRs against. - **Deployment:** determines the branch is cloned during job executions for each environment. For more info, check out this [FAQ page on this topic](/faqs/Environments/custom-branch-settings)! @@ -47,10 +47,13 @@ For more info, check out this [FAQ page on this topic](/faqs/Environments/custom ### Extended attributes (Beta) :::important This feature is currently in beta - Extended Attributes is currently in [beta](/docs/dbt-versions/product-lifecycles?) for select users and is subject to change. ::: +:::note +Extended attributes are retrieved and applied only at runtime when `profiles.yml` is requested for a specific Cloud run. 
Extended attributes are currently _not_ taken into consideration for Cloud-specific features such as PrivateLink or SSH Tunneling that do not rely on `profiles.yml` values. +::: + Extended Attributes is a feature that allows users to set a flexible [profiles.yml](/docs/core/connect-data-platform/profiles.yml) snippet in their dbt Cloud Environment settings. It provides users with more control over environments (both deployment and development) and extends how dbt Cloud connects to the data platform within a given environment. Extended Attributes is a text box extension at the environment level that overrides connection or environment credentials, including any custom environment variables. You can set any YAML attributes that a dbt adapter accepts in its `profiles.yml`. @@ -59,7 +62,7 @@ Something to note, Extended Attributes doesn't mask secret values. We recommend
-If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [orchestrating job runs](/docs/deploy/deployments), Extended Attributes parses through the provided YAML and extracts the `profiles.yml` attributes. For each individual attribute: +If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), or [orchestrating job runs](/docs/deploy/deployments), Extended Attributes parses through the provided YAML and extracts the `profiles.yml` attributes. For each individual attribute: - If the attribute exists in another source (such as your project settings), it will replace its value (like environment-level values) in the profile. It also overrides any custom environment variables. diff --git a/website/snippets/_enterprise-permissions-table.md b/website/snippets/_enterprise-permissions-table.md index 779c5bcb479..3eb313e0f5b 100644 --- a/website/snippets/_enterprise-permissions-table.md +++ b/website/snippets/_enterprise-permissions-table.md @@ -7,7 +7,7 @@ Key: Permissions: * Account-level permissions — Permissions related to management of the dbt Cloud account. For example, billing and account settings. -* Project-level permissions — Permissions related to the projects in dbt Cloud. For example, repos and access to the IDE. +* Project-level permissions — Permissions related to the projects in dbt Cloud. For example, repos and access to the IDE or dbt Cloud CLI. ### Account roles Account roles enable you to manage the dbt Cloud account and manage the account settings (for example, generating service tokens, inviting users, configuring SSO). They also provide project-level permissions. The **Account Admin** role is the highest level of access you can assign. @@ -20,8 +20,10 @@ Account roles enable you to manage the dbt Cloud account and manage the account | Audit logs | R | | | R | | | Auth provider | W | | | W | R | | Billing | W | W | | | R | +| Groups | W | | R | W | R | | Invitations | W | | W | W | R | | IP restrictions | W | | | W | R | +| Licenses | W | | W | W | R | | Members | W | | W | W | R | | Project (create) | W | | W | | | | Public models | R | R | R | R | R | @@ -34,25 +36,23 @@ Account roles enable you to manage the dbt Cloud account and manage the account |:-------------------------|:-------------:|:-------------:|:---------------:|:--------------:|:------:| | Connections | W | | W | | R | | Credentials | W | | W | | R | -| Custom env. variables | W | | W | | R | +| Custom env. variables | W | | W | | R | | dbt adapters | W | | W | | R | -| Develop (IDE) | W | | W | | | +| Develop (IDE or dbt Cloud CLI) | W | | W | | | | Environments | W | | W | | R | -| Groups | W | | R | W | R | | Jobs | W | | W | | R | -| Licenses | W | | W | W | R | | Metadata | R | | R | | R | | Permissions | W | | W | W | R | | Profile | W | | W | | R | | Projects | W | | W | R | R | | Repositories | W | | W | | R | | Runs | W | | W | | R | -| Semantic Layer Config | W | | W | | R | +| Semantic Layer Config | W | | W | | R | ### Project role permissions -The project roles enable you to work within the projects in various capacities. They primarily provide access to project-level permissions such as repos and the IDE, but may also provide some account-level permissions. +The project roles enable you to work within the projects in various capacities. 
They primarily provide access to project-level permissions such as repos and the IDE or dbt Cloud CLI, but may also provide some account-level permissions. #### Account permissions for project roles @@ -61,12 +61,14 @@ The project roles enable you to work within the projects in various capacities. | Account settings | R | | R | | R | | | | | | R | | | Auth provider | | | | | | | | | | | | | | Billing | | | | | | | | | | | | | -| Invitations | W | R | R | R | R | R | R | | | R | R | | -| Members | W | | R | R | R | | | | | R | R | | +| Groups | R | | R | R | R | | | | | R | R | | +| Invitations | W | R | R | R | R | R | R | | | R | R | | +| Licenses | W | R | R | R | R | R | R | | | | R | | +| Members | W | | R | R | R | | | | | R | R | | | Project (create) | | | | | | | | | | | | | -| Public models | R | R | R | R | R | R | R | R | R | R | R | R | +| Public models | R | R | R | R | R | R | R | R | R | R | R | R | | Service tokens | | | | | | | | | | | | | -| Webhooks | W | | | W | | | | | | | | W | +| Webhooks | W | | | W | | | | | | | | W | #### Project permissions for project roles @@ -74,13 +76,11 @@ The project roles enable you to work within the projects in various capacities. |--------------------------|:-----:|:-------:|:--------------:|:---------:|:---------:|:---------:|:-----------:|:--------:|:--------------:|:-----------:|:----------:|:------:| | Connections | W | R | W | R | R | R | | | | R | R | | | Credentials | W | W | W | W | R | W | | | | R | R | | -| Custom env. variables | W | W | W | W | W | W | R | | | R | W | | +| Custom env. variables | W | W | W | W | W | W | R | | | R | W | | | dbt adapters | W | W | W | W | R | W | | | | R | R | | -| Develop (IDE) | W | W | | W | | | | | | | | | +| Develop (IDE or dbt Cloud CLI) | W | W | | W | | | | | | | | | | Environments | W | R | R | R | R | W | R | | | R | R | | -| Groups | R | | R | R | R | | | | | R | R | | | Jobs | W | R | R | W | R | W | R | | | R | R | | -| Licenses | W | R | R | R | R | R | R | | | | R | | | Metadata | R | R | R | R | R | R | R | R | | R | R | | | Permissions | W | | R | R | R | | | | | | W | | | Profile | W | R | W | R | R | R | | | | R | R | | diff --git a/website/snippets/_manifest-versions.md b/website/snippets/_manifest-versions.md new file mode 100644 index 00000000000..c9b3e7af6ec --- /dev/null +++ b/website/snippets/_manifest-versions.md @@ -0,0 +1,11 @@ + +| dbt Core version | Manifest version | +|------------------|---------------------------------------------------------------| +| v1.7 | [v11](https://schemas.getdbt.com/dbt/manifest/v11/index.html) | +| v1.6 | [v10](https://schemas.getdbt.com/dbt/manifest/v10/index.html) | +| v1.5 | [v9](https://schemas.getdbt.com/dbt/manifest/v9/index.html) | +| v1.4 | [v8](https://schemas.getdbt.com/dbt/manifest/v8/index.html) | +| v1.3 | [v7](https://schemas.getdbt.com/dbt/manifest/v7/index.html) | +| v1.2 | [v6](https://schemas.getdbt.com/dbt/manifest/v6/index.html) | +| v1.1 | [v5](https://schemas.getdbt.com/dbt/manifest/v5/index.html) | +| v1.0 | [v4](https://schemas.getdbt.com/dbt/manifest/v4/index.html) | \ No newline at end of file diff --git a/website/snippets/_microsoft-adapters-soon.md b/website/snippets/_microsoft-adapters-soon.md new file mode 100644 index 00000000000..c3f30ef0939 --- /dev/null +++ b/website/snippets/_microsoft-adapters-soon.md @@ -0,0 +1,3 @@ +:::tip Coming soon +dbt Cloud support for the Microsoft Fabric and Azure Synapse Analytics adapters is coming soon! 
+::: \ No newline at end of file diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md index b802db9c5ae..ad248bc3ca9 100644 --- a/website/snippets/_new-sl-setup.md +++ b/website/snippets/_new-sl-setup.md @@ -1,13 +1,13 @@ You can set up the dbt Semantic Layer in dbt Cloud at the environment and project level. Before you begin: -- You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment, hosted in North America. +- You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment. Single-tenant coming soon. - You must be part of the Owner group, and have the correct [license](/docs/cloud/manage-access/seats-and-users) and [permissions](/docs/cloud/manage-access/self-service-permissions) to configure the Semantic Layer: * Enterprise plan — Developer license with Account Admin permissions. Or Owner with a Developer license, assigned Project Creator, Database Admin, or Admin permissions. * Team plan — Owner with a Developer license. - You must have a successful run in your new environment. :::tip -If you're using the legacy Semantic Layer, we **highly** recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/migration/sl-migration) for more info. +If you're using the legacy Semantic Layer, dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or newer to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/migration/sl-migration) for details. ::: 1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher. diff --git a/website/snippets/_sl-connect-and-query-api.md b/website/snippets/_sl-connect-and-query-api.md new file mode 100644 index 00000000000..429f41c3bf6 --- /dev/null +++ b/website/snippets/_sl-connect-and-query-api.md @@ -0,0 +1,10 @@ +You can query your metrics in a JDBC-enabled tool or use existing first-class integrations with the dbt Semantic Layer. + +You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment. Single-tenant coming soon. + +- To learn how to use the JDBC or GraphQL API and what tools you can query it with, refer to [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview). + + * To authenticate, you need to [generate a service token](/docs/dbt-cloud-apis/service-tokens) with Semantic Layer Only and Metadata Only permissions. + * Refer to the [SQL query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the API. + +- To learn more about the sophisticated integrations that connect to the dbt Semantic Layer, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) for more info. diff --git a/website/snippets/_sl-create-semanticmodel.md b/website/snippets/_sl-create-semanticmodel.md index bc4276efcb6..6e0376ab10b 100644 --- a/website/snippets/_sl-create-semanticmodel.md +++ b/website/snippets/_sl-create-semanticmodel.md @@ -1,4 +1,4 @@ -The following steps will walk you through setting up semantic models, which you can do with the dbt Cloud IDE or the CLI. 
Semantic models consist of [entities](/docs/build/entities), [dimensions](/docs/build/dimensions), and [measures](/docs/build/measures). +The following steps describe how to set up semantic models. Semantic models consist of [entities](/docs/build/entities), [dimensions](/docs/build/dimensions), and [measures](/docs/build/measures). We highly recommend you read the overview of what a [semantic model](/docs/build/semantic-models) is before getting started. If you're working in the [Jaffle shop example](https://github.com/dbt-labs/jaffle-sl-template), delete the `orders.yml` config or delete the .yml extension so it's ignored during parsing. **We'll be rebuilding it step by step in this example.** diff --git a/website/snippets/_sl-define-metrics.md b/website/snippets/_sl-define-metrics.md index 29af3f5b7c3..3734e819c1b 100644 --- a/website/snippets/_sl-define-metrics.md +++ b/website/snippets/_sl-define-metrics.md @@ -1,4 +1,4 @@ -Now that you've created your first semantic model, it's time to define your first metric! You can define metrics with the dbt Cloud IDE or CLI. +Now that you've created your first semantic model, it's time to define your first metric! You can define metrics with the dbt Cloud IDE or command line. MetricFlow supports different metric types like [simple](/docs/build/simple), [ratio](/docs/build/ratio), [cumulative](/docs/build/cumulative), and [derived](/docs/build/derived). It's recommended that you read the [metrics overview docs](/docs/build/metrics-overview) before getting started. diff --git a/website/snippets/_sl-install-metricflow.md b/website/snippets/_sl-install-metricflow.md deleted file mode 100644 index 73e60d34e85..00000000000 --- a/website/snippets/_sl-install-metricflow.md +++ /dev/null @@ -1,8 +0,0 @@ -Install the [MetricFlow CLI](/docs/build/metricflow-cli) as an extension of a dbt adapter from PyPI. The MetricFlow CLI is compatible with Python versions 3.8, 3.9, 3.10 and 3.11 - -Use pip install `metricflow` and your [dbt adapter](/docs/supported-data-platforms): - -- Create or activate your virtual environment. `python -m venv venv` or `source your-venv/bin/activate` -- Run `pip install "dbt-metricflow[your_adapter_name]"` - - You must specify `[your_adapter_name]`. - - For example, run `pip install "dbt-metricflow[snowflake]"` if you use a Snowflake adapter. diff --git a/website/snippets/_sl-partner-links.md b/website/snippets/_sl-partner-links.md index e9cc6af3564..c97c682171b 100644 --- a/website/snippets/_sl-partner-links.md +++ b/website/snippets/_sl-partner-links.md @@ -1,11 +1,105 @@ - -The dbt Semantic Layer integrations are capable of querying dbt metrics, importing definitions, surfacing the underlying data in partner tools, and more. These are the following tools that integrate with the dbt Semantic Layer: +The following tools integrate with the dbt Semantic Layer: -1. **Mode** — To learn more about integrating with Mode, check out their [documentation](https://mode.com/help/articles/supported-databases/#dbt-semantic-layer) for more info. -2. **Hex** — To learn more about integrating with Hex, check out their [documentation](https://learn.hex.tech/docs/connect-to-data/data-connections/dbt-integration#dbt-semantic-layer-integration) for more info. Additionally, refer to [dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex. -3. **Google Sheets** — Google Sheets integration coming soon. -4. 
**Tools that allows you to write SQL** — They must meet one of the two criteria: - * Supports a generic JDBC driver option (such as DataGrip) or - * Supports Dremio and uses ArrowFlightSQL driver version 12.0.0 or higher. + -Before you connect to these tools, you'll need to first [set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) and [generate a service token](/docs/dbt-cloud-apis/service-tokens) to create a Semantic Layer Only and Metadata Only service token. +
+ +Before you connect to these tools, you'll need to first [set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) and [generate a service token](/docs/dbt-cloud-apis/service-tokens) to create **Semantic Layer Only** and **Metadata Only** permissions. diff --git a/website/snippets/_sl-plan-info.md b/website/snippets/_sl-plan-info.md index 5fba18de6bb..083ab2209bc 100644 --- a/website/snippets/_sl-plan-info.md +++ b/website/snippets/_sl-plan-info.md @@ -1,2 +1,2 @@ -To define and query metrics with the {props.product}, you must be on a {props.plan} multi-tenant plan, {props.instance} (Additional region support coming soon).

The re-released dbt Semantic Layer is available on dbt v1.6 or higher. dbt Core users can use the MetricFlow CLI to define metrics in their local project, but won't be able to dynamically query them with integrated tools.


+To define and query metrics with the {props.product}, you must be on a {props.plan} multi-tenant plan.
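As a sketch of what "defining metrics" looks like in practice, the YAML below declares one simple metric. It assumes a semantic model already exposes an `order_total` measure (the same example metric queried elsewhere in this changeset); names will differ in your project.

```yml
metrics:
  - name: order_total
    label: Order total
    description: "Sum of all order totals."
    type: simple
    type_params:
      # Assumes a measure named order_total exists in a semantic model.
      measure: order_total
```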


diff --git a/website/snippets/_sl-run-prod-job.md b/website/snippets/_sl-run-prod-job.md new file mode 100644 index 00000000000..a637b0b431e --- /dev/null +++ b/website/snippets/_sl-run-prod-job.md @@ -0,0 +1,7 @@ +Once you’ve defined metrics in your dbt project, you can perform a job run in your deployment environment in dbt Cloud to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer currently. + +1. Select **Deploy** from the top navigation bar. +2. Select **Jobs** to rerun the job with the most recent code in the deployment environment. +3. Your metric should appear as a red node in the dbt Cloud IDE and dbt directed acyclic graphs (DAG). + + diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md index b250fac4f31..43ebd929cb3 100644 --- a/website/snippets/_sl-test-and-query-metrics.md +++ b/website/snippets/_sl-test-and-query-metrics.md @@ -1,31 +1,68 @@ -:::important Testing and querying metrics in the dbt Cloud IDE not yet supported +This section explains how you can test and run MetricFlow commands with dbt Cloud or dbt Core (dbt Cloud IDE support coming soon). dbt Cloud IDE users can skip to [Run a production job](#run-a-production-job) to run a model. -Support for testing or querying metrics in the dbt Cloud IDE is not available in the current beta but is coming soon. +:::important Testing and querying metrics in the dbt Cloud IDE is currently not supported -You can use the **Preview** or **Compile** buttons in the IDE to run semantic validations and make sure your metrics are defined. You can [dynamically query metrics](#connect-and-query-api) with integrated tools on a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) plan using the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview). +Support for running [MetricFlow commands](/docs/build/metricflow-commands) in the dbt Cloud IDE is not available but is coming soon. -Currently, you can define and test metrics using the MetricFlow CLI. dbt Cloud IDE support is coming soon. Alternatively, you can test using SQL client tools like DataGrip, DBeaver, or RazorSQL. +You can use the **Preview** or **Compile** buttons in the IDE to run semantic validations and make sure your metrics are defined. Alternatively, you can run commands with the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or with SQL client tools like DataGrip, DBeaver, or RazorSQL. ::: -This section will explain how you can test and query metrics using the MetricFlow CLI (dbt Cloud IDE support coming soon). + -Before you begin, you'll need to install the [MetricFlow CLI](/docs/build/metricflow-cli) package and make sure you run at least one model. -### Install MetricFlow -import InstallMetricFlow from '/snippets/_sl-install-metricflow.md'; + - -### Query and commit your metrics using the CLI +This section is for people using the dbt Cloud CLI (support for dbt Cloud IDE is coming soon). With dbt Cloud: -MetricFlow needs a `semantic_manifest.json` in order to build a semantic graph. To generate a semantic_manifest.json artifact run `dbt parse`. This will create the file in your `/target` directory. If you're working from the Jaffle shop example, run `dbt seed && dbt run` before preceding to ensure the data exists in your warehouse. +- You can run MetricFlow commands after installing the dbt Cloud CLI. They're integrated with dbt Cloud so you can use them immediately. +- Your account will automatically manage version control for you. -1. 
Make sure you have the MetricFlow CLI installed and up to date. -2. Run `mf --help` to confirm you have MetricFlow installed and view the available commands. -3. Run `mf query --metrics --group-by ` to query the metrics and dimensions. For example, `mf query --metrics order_total --group-by metric_time` -4. Verify that the metric values are what you expect. To further understand how the metric is being generated, you can view the generated SQL if you type `--explain` in the CLI. -5. Run `mf validate-configs` to run validation on your semantic models and metrics. -6. Commit and merge the code changes that contain the metric definitions. +To get started: + +1. Make sure you've installed the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). +2. Navigate to your dbt project directory. +3. Run a dbt command, such as `dbt parse`, `dbt run`, `dbt compile`, or `dbt build`. If you don't, you'll receive an error message that begins with: "ensure that you've ran an artifacts...." + - MetricFlow builds a semantic graph and generates a `semantic_manifest.json` file in dbt Cloud, which is stored in the `/target` directory. If using the Jaffle shop example, run `dbt seed && dbt run` to ensure the required data is in your data platform before proceeding. + +4. Run `dbt sl --help` to confirm you have MetricFlow installed and that you can view the available commands. +5. Run `dbt sl query --metrics --group-by ` to query the metrics and dimensions. For example, `dbt sl query --metrics order_total --group-by metric_time` +6. Verify that the metric values are what you expect. To further understand how the metric is being generated, you can view the generated SQL if you type `--compile` in the command line. +7. Commit and merge the code changes that contain the metric definitions. To streamline your metric querying process, you can connect to the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) to access your metrics programmatically. For SQL syntax, refer to [Querying the API for metric metadata](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the API. + + + + + + + +This step is for dbt Core users only. MetricFlow is compatible with Python versions 3.8, 3.9, 3.10 and 3.11. You need to use `pip` to install MetricFlow on Windows or Linux operating systems: + +:::note +The dbt Cloud CLI is strongly recommended to define and query metrics for your dbt project in dbt Cloud or dbt Core with MetricFlow. If you're using dbt Core, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow. +::: + + +1. Install [MetricFlow](/docs/build/metricflow-commands) as an extension of a dbt adapter from PyPI. +2. Create or activate your virtual environment with `python -m venv venv` or `source your-venv/bin/activate`. +3. Run `pip install dbt-metricflow`. + - You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. As an example for a Snowflake adapter, run `pip install "dbt-metricflow[snowflake]"`. + - You'll need to manage versioning between dbt Core, your adapter, and MetricFlow. +4. Run `dbt parse`. This allows MetricFlow to build a semantic graph and generate a `semantic_manifest.json`. + - This creates the file in your `/target` directory. If you're working from the Jaffle shop example, run `dbt seed && dbt run` before proceeding to ensure the data exists in your warehouse. +5. 
Run `mf --help` to confirm you have MetricFlow installed and that you can view the available commands. +6. Run `mf query --metrics --group-by ` to query the metrics and dimensions. For example, `mf query --metrics order_total --group-by metric_time`. +7. Verify that the metric values are what you expect. To further understand how the metric is being generated, you can view the generated SQL if you type `--explain` in the command line. +8. Run `mf validate-configs` to run validation on your semantic models and metrics. +9. Commit and merge the code changes that contain the metric definitions. + +To streamline your metric querying process, you can connect to the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) to access your metrics programmatically. For SQL syntax, refer to [Querying the API for metric metadata](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the API. + + + + + + diff --git a/website/snippets/_upgrade-move.md b/website/snippets/_upgrade-move.md new file mode 100644 index 00000000000..7572077fd1b --- /dev/null +++ b/website/snippets/_upgrade-move.md @@ -0,0 +1,5 @@ +:::important Upgrade Guides Are Moving + +The location of the dbt Core upgrade guides has changed, and they will soon be removed from `Guides`. The new location is in the `Docs` tab under `Available dbt versions`. You have been redirected to the new URL, so please update any saved links and bookmarks. + +::: \ No newline at end of file diff --git a/website/snippets/_v2-sl-prerequisites.md b/website/snippets/_v2-sl-prerequisites.md index 9fdc3b53143..c80db4d1c8f 100644 --- a/website/snippets/_v2-sl-prerequisites.md +++ b/website/snippets/_v2-sl-prerequisites.md @@ -1,17 +1,15 @@ -To use the Semantic Layer, you must: - -- Have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment, hosted in North America. +- Have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment. Single-tenant coming soon. - Have both your production and development environments running dbt version 1.6 or higher. Refer to [upgrade in dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) for more info. -- Use Snowflake, BigQuery, Databricks, or Redshift (dbt Cloud Postgres support coming soon). +- Use Snowflake, BigQuery, Databricks, or Redshift. - Create a successful run in the environment where you configure the Semantic Layer. - **Note:** Semantic Layer currently supports the Deployment environment for querying. (_development querying experience coming soon_) - Set up the [Semantic Layer API](/docs/dbt-cloud-apis/sl-api-overview) in the integrated tool to import metric definitions. - - **Note:** To access the API and query metrics in downstream tools, you must have a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. dbt Core or Developer accounts can define metrics with the [MetricFlow CLI](/docs/build/metricflow-cli) or [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) but won't be able to dynamically query them.
-- Understand [MetricFlow's](/docs/build/about-metricflow) key concepts, which powers the revamped dbt Semantic Layer. - + - To access the API and query metrics in downstream tools, you must have a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. dbt Core or Developer accounts can define metrics but won't be able to dynamically query them.
+- Understand [MetricFlow's](/docs/build/about-metricflow) key concepts, which powers the latest dbt Semantic Layer. +- Note that SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections, [PrivateLink](/docs/cloud/secure/about-privatelink), and [Single sign-on (SSO)](/docs/cloud/manage-access/sso-overview) isn't supported yet.
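The dbt Core testing steps above condense into the following sketch. The metric and dimension names (`order_total`, `metric_time`) are the example values used in this changeset; replace them with your own.

```shell
# Build the semantic manifest that MetricFlow reads from the ./target directory.
dbt parse

# Query a metric grouped by time; --explain prints the generated SQL.
mf query --metrics order_total --group-by metric_time --explain

# Validate semantic models and metric definitions before committing.
mf validate-configs
```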
diff --git a/website/snippets/connect-starburst-trino/roles-starburst-enterprise.md b/website/snippets/connect-starburst-trino/roles-starburst-enterprise.md index ba11508f1b4..f832d52be20 100644 --- a/website/snippets/connect-starburst-trino/roles-starburst-enterprise.md +++ b/website/snippets/connect-starburst-trino/roles-starburst-enterprise.md @@ -1,3 +1,6 @@ -[comment: For context, the section title used for this snippet is "Roles in Starburst Enterprise" ]: # +[comment: For context, the section title used for this snippet is "Roles in Starburst Enterprise" ]: # -If connecting to a Starburst Enterprise cluster with built-in access controls enabled, you can't add the role as a suffix to the username, so the default role for the provided username is used instead. +If connecting to a Starburst Enterprise cluster with built-in access controls +enabled, you must specify a role using the format detailed in [Additional +parameters](#additional-parameters). If a role is not specified, the default +role for the provided username is used. \ No newline at end of file diff --git a/website/snippets/core-versions-table.md b/website/snippets/core-versions-table.md index 431e1f08b4c..6ec1cb01e32 100644 --- a/website/snippets/core-versions-table.md +++ b/website/snippets/core-versions-table.md @@ -2,14 +2,14 @@ | dbt Core | Initial Release | Support Level | Critical Support Until | |------------------------------------------------------------|-----------------|----------------|-------------------------| -| [**v1.7**](/guides/migration/versions/upgrading-to-v1.7) (beta)| Oct 26, 2023 | - | - | -| [**v1.6**](/guides/migration/versions/upgrading-to-v1.6) | Jul 31, 2023 | Active | Jul 30, 2024 | -| [**v1.5**](/guides/migration/versions/upgrading-to-v1.5) | Apr 27, 2023 | Critical | Apr 27, 2024 | -| [**v1.4**](/guides/migration/versions/upgrading-to-v1.4) | Jan 25, 2023 | Critical | Jan 25, 2024 | -| [**v1.3**](/guides/migration/versions/upgrading-to-v1.3) | Oct 12, 2022 | Critical | Oct 12, 2023 | -| [**v1.2**](/guides/migration/versions/upgrading-to-v1.2) | Jul 26, 2022 | End of Life* ⚠️ | Jul 26, 2023 | -| [**v1.1**](/guides/migration/versions/upgrading-to-v1.1) ⚠️ | Apr 28, 2022 | Deprecated ⛔️ | Deprecated ⛔️ | -| [**v1.0**](/guides/migration/versions/upgrading-to-v1.0) ⚠️ | Dec 3, 2021 | Deprecated ⛔️ | Deprecated ⛔️ | +| [**v1.7**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.7) | Nov 2, 2023 | Active | Nov 1, 2024 | +| [**v1.6**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.6) | Jul 31, 2023 | Active | Jul 30, 2024 | +| [**v1.5**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.5) | Apr 27, 2023 | Critical | Apr 27, 2024 | +| [**v1.4**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.4) | Jan 25, 2023 | Critical | Jan 25, 2024 | +| [**v1.3**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.3) | Oct 12, 2022 | End of Life* ⚠️ | Oct 12, 2023 | +| [**v1.2**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.2) | Jul 26, 2022 | End of Life* ⚠️ | Jul 26, 2023 | +| [**v1.1**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.1) ⚠️ | Apr 28, 2022 | Deprecated ⛔️ | Deprecated ⛔️ | +| [**v1.0**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.0) ⚠️ | Dec 3, 2021 | Deprecated ⛔️ | Deprecated ⛔️ | | **v0.X** ⛔️ | (Various dates) | Deprecated ⛔️ | Deprecated ⛔️ | _*All versions of dbt Core since v1.0 are available in dbt Cloud until further notice. Versions that are EOL do not receive any fixes. 
For the best support, we recommend upgrading to a version released within the past 12 months._ ### Planned future releases @@ -18,6 +18,5 @@ _Future release dates are tentative and subject to change._ | dbt Core | Planned Release | Critical & dbt Cloud Support Until | |----------|-----------------|-------------------------------------| -| **v1.7** | _Oct 2023_ | _Oct 2024_ | | **v1.8** | _Jan 2024_ | _Jan 2025_ | | **v1.9** | _Apr 2024_ | _Apr 2025_ | diff --git a/website/snippets/metadata-api-prerequisites.md b/website/snippets/metadata-api-prerequisites.md index 35532e28bdc..6e2d1550223 100644 --- a/website/snippets/metadata-api-prerequisites.md +++ b/website/snippets/metadata-api-prerequisites.md @@ -2,5 +2,5 @@ - dbt Cloud [multi-tenant](/docs/cloud/about-cloud/tenancy#multi-tenant) or [single tenant](/docs/cloud/about-cloud/tenancy#single-tenant) account - You must be on a [Team or Enterprise plan](https://www.getdbt.com/pricing/) -- Your projects must be on dbt version 1.0 or higher. Refer to [Version migration guides](/guides/migration/versions) to upgrade +- Your projects must be on dbt version 1.0 or higher. Refer to [Version migration guides](/docs/dbt-versions/core-upgrade) to upgrade diff --git a/website/snippets/quickstarts/schedule-a-job.md b/website/snippets/quickstarts/schedule-a-job.md index 59d428bdfaa..ab8f4350dbf 100644 --- a/website/snippets/quickstarts/schedule-a-job.md +++ b/website/snippets/quickstarts/schedule-a-job.md @@ -24,15 +24,16 @@ Jobs are a set of dbt commands that you want to run on a schedule. For example, As the `jaffle_shop` business gains more customers, and those customers create more orders, you will see more records added to your source data. Because you materialized the `customers` model as a table, you'll need to periodically rebuild your table to ensure that the data stays up-to-date. This update will happen when you run a job. -1. After creating your deployment environment, you should be directed to the page for new environment. If not, select **Deploy** in the upper left, then click **Jobs**. -2. Click **Create one** and provide a name, for example "Production run", and link to the Environment you just created. -3. Scroll down to "Execution Settings" and select **Generate docs on run**. -4. Under "Commands," add this command as part of your job if you don't see them: - * `dbt build` -5. For this exercise, do _not_ set a schedule for your project to run — while your organization's project should run regularly, there's no need to run this example project on a schedule. Scheduling a job is sometimes referred to as _deploying a project_. -6. Select **Save**, then click **Run now** to run your job. -7. Click the run and watch its progress under "Run history." -8. Once the run is complete, click **View Documentation** to see the docs for your project. +1. After creating your deployment environment, you should be directed to the page for a new environment. If not, select **Deploy** in the upper left, then click **Jobs**. +2. Click **Create one** and provide a name, for example, "Production run", and link to the Environment you just created. +3. Scroll down to the **Execution Settings** section. +4. Under **Commands**, add this command as part of your job if you don't see it: + * `dbt build` +5. Select the **Generate docs on run** checkbox to automatically [generate updated project docs](/docs/collaborate/build-and-view-your-docs) each time your job runs. +6. 
diff --git a/website/src/components/stoplight/index.js b/website/src/components/stoplight/index.js
index bff43dd27c8..7baf2991b4f 100644
--- a/website/src/components/stoplight/index.js
+++ b/website/src/components/stoplight/index.js
@@ -1,17 +1,16 @@
 import { API } from "@stoplight/elements";
 import React from "react";
 import useBaseUrl from "@docusaurus/useBaseUrl";
+
 export default function Stoplight({ version }) {
   if (!["v1", "v2", "v3", "private"].includes(version)) {
     return null;
   }
+
   return (
     <>
- + + div>button.button:hover { display: none; }
+.card-container {
+ position: relative;
+}
+
+.external-link {
+ position: absolute;
+ top: 0;
+ right: 0;
+ margin: 10px;
+ color: #818589; /* You can adjust the color as needed */
+}
+
 @media (max-width: 996px) {
 .quickstart-container {
 flex-direction: column;
diff --git a/website/static/css/stoplight-base.css b/website/static/css/stoplight-base.css
new file mode 100644
index 00000000000..98c2043e757
--- /dev/null
+++ b/website/static/css/stoplight-base.css
@@ -0,0 +1,111 @@
+/*
+ DO NOT EDIT - any customizations should be made in stoplight-custom.css
+
+ Context:
+ The styles shipped with Stoplight include global CSS resets which override base navigation styles in Docusaurus.
+ Until Stoplight ships a hosted version of their CSS with resets correctly scoped, we include a modified version
+ here which removes certain styles that were causing issues.
+ + Original CSS location: node_modules/@stoplight/elements/styles.min.css + + GitHub issue tracking this bug: + https://github.com/stoplightio/elements/issues/1792 +*/ + +/* Reset styles with problematic globals commented out */ +blockquote,dd,dl,figure,h1,h2,h3,h4,h5,h6,hr,p,pre{ + margin:0 +} +button{ + background-color:transparent; + background-image:none +} +:focus{ + outline:none +} +fieldset,ol,ul{ + margin:0; + padding:0 +} +ol,ul{ + list-style:none +} +html{ + /* font-family:var(--font-ui); */ + line-height:1.5 +} +body{ + text-rendering:optimizeSpeed; + /* font-family:inherit; */ + line-height:inherit; + margin:0; + min-height:100vh +} +*,:after,:before{ + border:0 solid var(--color-border,currentColor); + box-sizing:border-box +} +hr{ + border-top-width:1px +} +img{ + border-style:solid +} +textarea{ + resize:vertical +} +input::-ms-input-placeholder,textarea::-ms-input-placeholder{ + color:#a1a1aa +} +input::placeholder,textarea::placeholder{ + color:#a1a1aa +} +[role=button],button{ + cursor:pointer +} +table{ + border-collapse:collapse +} +h1,h2,h3,h4,h5,h6{ + font-size:inherit; + font-weight:inherit +} +a{ + color:inherit; + text-decoration:inherit +} +button,input,optgroup,select,textarea{ + color:inherit; + line-height:inherit; + padding:0 +} +code,kbd,pre,samp{ + font-family:var(--font-mono) +} +audio,canvas,embed,iframe,img,object,svg,video{ + display:block; + vertical-align:middle +} +img,video{ + height:auto; + max-width:100% +} +button{ + font-family:var(--font-ui) +} +select{ + -moz-appearance:none; + -webkit-appearance:none +} +select::-ms-expand{ + display:none +} +select{ + font-size:inherit +} +iframe{ + border:0 +} + +/* Original non reset styles */ +.sl-stack--horizontal.sl-stack--1>:not(style)~:not(style){margin-left:4px}.sl-stack--vertical.sl-stack--1>:not(style)~:not(style){margin-top:4px}.sl-stack--horizontal.sl-stack--2>:not(style)~:not(style){margin-left:8px}.sl-stack--vertical.sl-stack--2>:not(style)~:not(style){margin-top:8px}.sl-stack--horizontal.sl-stack--3>:not(style)~:not(style){margin-left:12px}.sl-stack--vertical.sl-stack--3>:not(style)~:not(style){margin-top:12px}.sl-stack--horizontal.sl-stack--4>:not(style)~:not(style){margin-left:16px}.sl-stack--vertical.sl-stack--4>:not(style)~:not(style){margin-top:16px}.sl-stack--horizontal.sl-stack--5>:not(style)~:not(style){margin-left:20px}.sl-stack--vertical.sl-stack--5>:not(style)~:not(style){margin-top:20px}.sl-stack--horizontal.sl-stack--6>:not(style)~:not(style){margin-left:24px}.sl-stack--vertical.sl-stack--6>:not(style)~:not(style){margin-top:24px}.sl-stack--horizontal.sl-stack--7>:not(style)~:not(style){margin-left:28px}.sl-stack--vertical.sl-stack--7>:not(style)~:not(style){margin-top:28px}.sl-stack--horizontal.sl-stack--8>:not(style)~:not(style){margin-left:32px}.sl-stack--vertical.sl-stack--8>:not(style)~:not(style){margin-top:32px}.sl-stack--horizontal.sl-stack--9>:not(style)~:not(style){margin-left:36px}.sl-stack--vertical.sl-stack--9>:not(style)~:not(style){margin-top:36px}.sl-stack--horizontal.sl-stack--10>:not(style)~:not(style){margin-left:40px}.sl-stack--vertical.sl-stack--10>:not(style)~:not(style){margin-top:40px}.sl-stack--horizontal.sl-stack--12>:not(style)~:not(style){margin-left:48px}.sl-stack--vertical.sl-stack--12>:not(style)~:not(style){margin-top:48px}.sl-stack--horizontal.sl-stack--14>:not(style)~:not(style){margin-left:56px}.sl-stack--vertical.sl-stack--14>:not(style)~:not(style){margin-top:56px}.sl-stack--horizontal.sl-stack--16>:not(style)~:not(style){margin-left:64px}.sl-st
ack--vertical.sl-stack--16>:not(style)~:not(style){margin-top:64px}.sl-stack--horizontal.sl-stack--20>:not(style)~:not(style){margin-left:80px}.sl-stack--vertical.sl-stack--20>:not(style)~:not(style){margin-top:80px}.sl-stack--horizontal.sl-stack--24>:not(style)~:not(style){margin-left:96px}.sl-stack--vertical.sl-stack--24>:not(style)~:not(style){margin-top:96px}.sl-stack--horizontal.sl-stack--32>:not(style)~:not(style){margin-left:128px}.sl-stack--vertical.sl-stack--32>:not(style)~:not(style){margin-top:128px}.sl-content-center{align-content:center}.sl-content-start{align-content:flex-start}.sl-content-end{align-content:flex-end}.sl-content-between{align-content:space-between}.sl-content-around{align-content:space-around}.sl-content-evenly{align-content:space-evenly}.sl-items-start{align-items:flex-start}.sl-items-end{align-items:flex-end}.sl-items-center{align-items:center}.sl-items-baseline{align-items:baseline}.sl-items-stretch{align-items:stretch}.sl-self-auto{align-self:auto}.sl-self-start{align-self:flex-start}.sl-self-end{align-self:flex-end}.sl-self-center{align-self:center}.sl-self-stretch{align-self:stretch}.sl-bg-transparent{background-color:transparent}.sl-bg-current{background-color:currentColor}.sl-bg-lighten-100{background-color:var(--color-lighten-100)}.sl-bg-darken-100{background-color:var(--color-darken-100)}.sl-bg-primary{background-color:var(--color-primary)}.sl-bg-primary-tint{background-color:var(--color-primary-tint)}.sl-bg-primary-light{background-color:var(--color-primary-light)}.sl-bg-primary-dark{background-color:var(--color-primary-dark)}.sl-bg-primary-darker{background-color:var(--color-primary-darker)}.sl-bg-success{background-color:var(--color-success)}.sl-bg-success-tint{background-color:var(--color-success-tint)}.sl-bg-success-light{background-color:var(--color-success-light)}.sl-bg-success-dark{background-color:var(--color-success-dark)}.sl-bg-success-darker{background-color:var(--color-success-darker)}.sl-bg-warning{background-color:var(--color-warning)}.sl-bg-warning-tint{background-color:var(--color-warning-tint)}.sl-bg-warning-light{background-color:var(--color-warning-light)}.sl-bg-warning-dark{background-color:var(--color-warning-dark)}.sl-bg-warning-darker{background-color:var(--color-warning-darker)}.sl-bg-danger{background-color:var(--color-danger)}.sl-bg-danger-tint{background-color:var(--color-danger-tint)}.sl-bg-danger-light{background-color:var(--color-danger-light)}.sl-bg-danger-dark{background-color:var(--color-danger-dark)}.sl-bg-danger-darker{background-color:var(--color-danger-darker)}.sl-bg-code{background-color:var(--color-code)}.sl-bg-on-code{background-color:var(--color-on-code)}.sl-bg-on-primary{background-color:var(--color-on-primary)}.sl-bg-on-success{background-color:var(--color-on-success)}.sl-bg-on-warning{background-color:var(--color-on-warning)}.sl-bg-on-danger{background-color:var(--color-on-danger)}.sl-bg-canvas-50{background-color:var(--color-canvas-50)}.sl-bg-canvas-100{background-color:var(--color-canvas-100)}.sl-bg-canvas-200{background-color:var(--color-canvas-200)}.sl-bg-canvas-300{background-color:var(--color-canvas-300)}.sl-bg-canvas-400{background-color:var(--color-canvas-400)}.sl-bg-canvas-500{background-color:var(--color-canvas-500)}.sl-bg-canvas-dark{background-color:var(--color-canvas-dark)}.sl-bg-canvas-pure{background-color:var(--color-canvas-pure)}.sl-bg-canvas{background-color:var(--color-canvas)}.sl-bg-canvas-tint{background-color:var(--color-canvas-tint)}.sl-bg-canvas-dialog{background-color:var(--color-ca
nvas-dialog)}.sl-bg-body{background-color:var(--color-text)}.sl-bg-body-muted{background-color:var(--color-text-muted)}.sl-bg-body-light{background-color:var(--color-text-light)}.hover\:sl-bg-transparent:hover{background-color:transparent}.hover\:sl-bg-current:hover{background-color:currentColor}.hover\:sl-bg-lighten-100:hover{background-color:var(--color-lighten-100)}.hover\:sl-bg-darken-100:hover{background-color:var(--color-darken-100)}.hover\:sl-bg-primary:hover{background-color:var(--color-primary)}.hover\:sl-bg-primary-tint:hover{background-color:var(--color-primary-tint)}.hover\:sl-bg-primary-light:hover{background-color:var(--color-primary-light)}.hover\:sl-bg-primary-dark:hover{background-color:var(--color-primary-dark)}.hover\:sl-bg-primary-darker:hover{background-color:var(--color-primary-darker)}.hover\:sl-bg-success:hover{background-color:var(--color-success)}.hover\:sl-bg-success-tint:hover{background-color:var(--color-success-tint)}.hover\:sl-bg-success-light:hover{background-color:var(--color-success-light)}.hover\:sl-bg-success-dark:hover{background-color:var(--color-success-dark)}.hover\:sl-bg-success-darker:hover{background-color:var(--color-success-darker)}.hover\:sl-bg-warning:hover{background-color:var(--color-warning)}.hover\:sl-bg-warning-tint:hover{background-color:var(--color-warning-tint)}.hover\:sl-bg-warning-light:hover{background-color:var(--color-warning-light)}.hover\:sl-bg-warning-dark:hover{background-color:var(--color-warning-dark)}.hover\:sl-bg-warning-darker:hover{background-color:var(--color-warning-darker)}.hover\:sl-bg-danger:hover{background-color:var(--color-danger)}.hover\:sl-bg-danger-tint:hover{background-color:var(--color-danger-tint)}.hover\:sl-bg-danger-light:hover{background-color:var(--color-danger-light)}.hover\:sl-bg-danger-dark:hover{background-color:var(--color-danger-dark)}.hover\:sl-bg-danger-darker:hover{background-color:var(--color-danger-darker)}.hover\:sl-bg-code:hover{background-color:var(--color-code)}.hover\:sl-bg-on-code:hover{background-color:var(--color-on-code)}.hover\:sl-bg-on-primary:hover{background-color:var(--color-on-primary)}.hover\:sl-bg-on-success:hover{background-color:var(--color-on-success)}.hover\:sl-bg-on-warning:hover{background-color:var(--color-on-warning)}.hover\:sl-bg-on-danger:hover{background-color:var(--color-on-danger)}.hover\:sl-bg-canvas-50:hover{background-color:var(--color-canvas-50)}.hover\:sl-bg-canvas-100:hover{background-color:var(--color-canvas-100)}.hover\:sl-bg-canvas-200:hover{background-color:var(--color-canvas-200)}.hover\:sl-bg-canvas-300:hover{background-color:var(--color-canvas-300)}.hover\:sl-bg-canvas-400:hover{background-color:var(--color-canvas-400)}.hover\:sl-bg-canvas-500:hover{background-color:var(--color-canvas-500)}.hover\:sl-bg-canvas-dark:hover{background-color:var(--color-canvas-dark)}.hover\:sl-bg-canvas-pure:hover{background-color:var(--color-canvas-pure)}.hover\:sl-bg-canvas:hover{background-color:var(--color-canvas)}.hover\:sl-bg-canvas-tint:hover{background-color:var(--color-canvas-tint)}.hover\:sl-bg-canvas-dialog:hover{background-color:var(--color-canvas-dialog)}.hover\:sl-bg-body:hover{background-color:var(--color-text)}.hover\:sl-bg-body-muted:hover{background-color:var(--color-text-muted)}.hover\:sl-bg-body-light:hover{background-color:var(--color-text-light)}.focus\:sl-bg-transparent:focus{background-color:transparent}.focus\:sl-bg-current:focus{background-color:currentColor}.focus\:sl-bg-lighten-100:focus{background-color:var(--color-lighten-100)}.focus\:sl-bg-d
arken-100:focus{background-color:var(--color-darken-100)}.focus\:sl-bg-primary:focus{background-color:var(--color-primary)}.focus\:sl-bg-primary-tint:focus{background-color:var(--color-primary-tint)}.focus\:sl-bg-primary-light:focus{background-color:var(--color-primary-light)}.focus\:sl-bg-primary-dark:focus{background-color:var(--color-primary-dark)}.focus\:sl-bg-primary-darker:focus{background-color:var(--color-primary-darker)}.focus\:sl-bg-success:focus{background-color:var(--color-success)}.focus\:sl-bg-success-tint:focus{background-color:var(--color-success-tint)}.focus\:sl-bg-success-light:focus{background-color:var(--color-success-light)}.focus\:sl-bg-success-dark:focus{background-color:var(--color-success-dark)}.focus\:sl-bg-success-darker:focus{background-color:var(--color-success-darker)}.focus\:sl-bg-warning:focus{background-color:var(--color-warning)}.focus\:sl-bg-warning-tint:focus{background-color:var(--color-warning-tint)}.focus\:sl-bg-warning-light:focus{background-color:var(--color-warning-light)}.focus\:sl-bg-warning-dark:focus{background-color:var(--color-warning-dark)}.focus\:sl-bg-warning-darker:focus{background-color:var(--color-warning-darker)}.focus\:sl-bg-danger:focus{background-color:var(--color-danger)}.focus\:sl-bg-danger-tint:focus{background-color:var(--color-danger-tint)}.focus\:sl-bg-danger-light:focus{background-color:var(--color-danger-light)}.focus\:sl-bg-danger-dark:focus{background-color:var(--color-danger-dark)}.focus\:sl-bg-danger-darker:focus{background-color:var(--color-danger-darker)}.focus\:sl-bg-code:focus{background-color:var(--color-code)}.focus\:sl-bg-on-code:focus{background-color:var(--color-on-code)}.focus\:sl-bg-on-primary:focus{background-color:var(--color-on-primary)}.focus\:sl-bg-on-success:focus{background-color:var(--color-on-success)}.focus\:sl-bg-on-warning:focus{background-color:var(--color-on-warning)}.focus\:sl-bg-on-danger:focus{background-color:var(--color-on-danger)}.focus\:sl-bg-canvas-50:focus{background-color:var(--color-canvas-50)}.focus\:sl-bg-canvas-100:focus{background-color:var(--color-canvas-100)}.focus\:sl-bg-canvas-200:focus{background-color:var(--color-canvas-200)}.focus\:sl-bg-canvas-300:focus{background-color:var(--color-canvas-300)}.focus\:sl-bg-canvas-400:focus{background-color:var(--color-canvas-400)}.focus\:sl-bg-canvas-500:focus{background-color:var(--color-canvas-500)}.focus\:sl-bg-canvas-dark:focus{background-color:var(--color-canvas-dark)}.focus\:sl-bg-canvas-pure:focus{background-color:var(--color-canvas-pure)}.focus\:sl-bg-canvas:focus{background-color:var(--color-canvas)}.focus\:sl-bg-canvas-tint:focus{background-color:var(--color-canvas-tint)}.focus\:sl-bg-canvas-dialog:focus{background-color:var(--color-canvas-dialog)}.focus\:sl-bg-body:focus{background-color:var(--color-text)}.focus\:sl-bg-body-muted:focus{background-color:var(--color-text-muted)}.focus\:sl-bg-body-light:focus{background-color:var(--color-text-light)}.active\:sl-bg-transparent:active{background-color:transparent}.active\:sl-bg-current:active{background-color:currentColor}.active\:sl-bg-lighten-100:active{background-color:var(--color-lighten-100)}.active\:sl-bg-darken-100:active{background-color:var(--color-darken-100)}.active\:sl-bg-primary:active{background-color:var(--color-primary)}.active\:sl-bg-primary-tint:active{background-color:var(--color-primary-tint)}.active\:sl-bg-primary-light:active{background-color:var(--color-primary-light)}.active\:sl-bg-primary-dark:active{background-color:var(--color-primary-dark)}.active\:sl-bg-pr
imary-darker:active{background-color:var(--color-primary-darker)}.active\:sl-bg-success:active{background-color:var(--color-success)}.active\:sl-bg-success-tint:active{background-color:var(--color-success-tint)}.active\:sl-bg-success-light:active{background-color:var(--color-success-light)}.active\:sl-bg-success-dark:active{background-color:var(--color-success-dark)}.active\:sl-bg-success-darker:active{background-color:var(--color-success-darker)}.active\:sl-bg-warning:active{background-color:var(--color-warning)}.active\:sl-bg-warning-tint:active{background-color:var(--color-warning-tint)}.active\:sl-bg-warning-light:active{background-color:var(--color-warning-light)}.active\:sl-bg-warning-dark:active{background-color:var(--color-warning-dark)}.active\:sl-bg-warning-darker:active{background-color:var(--color-warning-darker)}.active\:sl-bg-danger:active{background-color:var(--color-danger)}.active\:sl-bg-danger-tint:active{background-color:var(--color-danger-tint)}.active\:sl-bg-danger-light:active{background-color:var(--color-danger-light)}.active\:sl-bg-danger-dark:active{background-color:var(--color-danger-dark)}.active\:sl-bg-danger-darker:active{background-color:var(--color-danger-darker)}.active\:sl-bg-code:active{background-color:var(--color-code)}.active\:sl-bg-on-code:active{background-color:var(--color-on-code)}.active\:sl-bg-on-primary:active{background-color:var(--color-on-primary)}.active\:sl-bg-on-success:active{background-color:var(--color-on-success)}.active\:sl-bg-on-warning:active{background-color:var(--color-on-warning)}.active\:sl-bg-on-danger:active{background-color:var(--color-on-danger)}.active\:sl-bg-canvas-50:active{background-color:var(--color-canvas-50)}.active\:sl-bg-canvas-100:active{background-color:var(--color-canvas-100)}.active\:sl-bg-canvas-200:active{background-color:var(--color-canvas-200)}.active\:sl-bg-canvas-300:active{background-color:var(--color-canvas-300)}.active\:sl-bg-canvas-400:active{background-color:var(--color-canvas-400)}.active\:sl-bg-canvas-500:active{background-color:var(--color-canvas-500)}.active\:sl-bg-canvas-dark:active{background-color:var(--color-canvas-dark)}.active\:sl-bg-canvas-pure:active{background-color:var(--color-canvas-pure)}.active\:sl-bg-canvas:active{background-color:var(--color-canvas)}.active\:sl-bg-canvas-tint:active{background-color:var(--color-canvas-tint)}.active\:sl-bg-canvas-dialog:active{background-color:var(--color-canvas-dialog)}.active\:sl-bg-body:active{background-color:var(--color-text)}.active\:sl-bg-body-muted:active{background-color:var(--color-text-muted)}.active\:sl-bg-body-light:active{background-color:var(--color-text-light)}.disabled\:sl-bg-transparent:disabled{background-color:transparent}.disabled\:sl-bg-current:disabled{background-color:currentColor}.disabled\:sl-bg-lighten-100:disabled{background-color:var(--color-lighten-100)}.disabled\:sl-bg-darken-100:disabled{background-color:var(--color-darken-100)}.disabled\:sl-bg-primary:disabled{background-color:var(--color-primary)}.disabled\:sl-bg-primary-tint:disabled{background-color:var(--color-primary-tint)}.disabled\:sl-bg-primary-light:disabled{background-color:var(--color-primary-light)}.disabled\:sl-bg-primary-dark:disabled{background-color:var(--color-primary-dark)}.disabled\:sl-bg-primary-darker:disabled{background-color:var(--color-primary-darker)}.disabled\:sl-bg-success:disabled{background-color:var(--color-success)}.disabled\:sl-bg-success-tint:disabled{background-color:var(--color-success-tint)}.disabled\:sl-bg-success-light:disabled{bac
kground-color:var(--color-success-light)}.disabled\:sl-bg-success-dark:disabled{background-color:var(--color-success-dark)}.disabled\:sl-bg-success-darker:disabled{background-color:var(--color-success-darker)}.disabled\:sl-bg-warning:disabled{background-color:var(--color-warning)}.disabled\:sl-bg-warning-tint:disabled{background-color:var(--color-warning-tint)}.disabled\:sl-bg-warning-light:disabled{background-color:var(--color-warning-light)}.disabled\:sl-bg-warning-dark:disabled{background-color:var(--color-warning-dark)}.disabled\:sl-bg-warning-darker:disabled{background-color:var(--color-warning-darker)}.disabled\:sl-bg-danger:disabled{background-color:var(--color-danger)}.disabled\:sl-bg-danger-tint:disabled{background-color:var(--color-danger-tint)}.disabled\:sl-bg-danger-light:disabled{background-color:var(--color-danger-light)}.disabled\:sl-bg-danger-dark:disabled{background-color:var(--color-danger-dark)}.disabled\:sl-bg-danger-darker:disabled{background-color:var(--color-danger-darker)}.disabled\:sl-bg-code:disabled{background-color:var(--color-code)}.disabled\:sl-bg-on-code:disabled{background-color:var(--color-on-code)}.disabled\:sl-bg-on-primary:disabled{background-color:var(--color-on-primary)}.disabled\:sl-bg-on-success:disabled{background-color:var(--color-on-success)}.disabled\:sl-bg-on-warning:disabled{background-color:var(--color-on-warning)}.disabled\:sl-bg-on-danger:disabled{background-color:var(--color-on-danger)}.disabled\:sl-bg-canvas-50:disabled{background-color:var(--color-canvas-50)}.disabled\:sl-bg-canvas-100:disabled{background-color:var(--color-canvas-100)}.disabled\:sl-bg-canvas-200:disabled{background-color:var(--color-canvas-200)}.disabled\:sl-bg-canvas-300:disabled{background-color:var(--color-canvas-300)}.disabled\:sl-bg-canvas-400:disabled{background-color:var(--color-canvas-400)}.disabled\:sl-bg-canvas-500:disabled{background-color:var(--color-canvas-500)}.disabled\:sl-bg-canvas-dark:disabled{background-color:var(--color-canvas-dark)}.disabled\:sl-bg-canvas-pure:disabled{background-color:var(--color-canvas-pure)}.disabled\:sl-bg-canvas:disabled{background-color:var(--color-canvas)}.disabled\:sl-bg-canvas-tint:disabled{background-color:var(--color-canvas-tint)}.disabled\:sl-bg-canvas-dialog:disabled{background-color:var(--color-canvas-dialog)}.disabled\:sl-bg-body:disabled{background-color:var(--color-text)}.disabled\:sl-bg-body-muted:disabled{background-color:var(--color-text-muted)}.disabled\:sl-bg-body-light:disabled{background-color:var(--color-text-light)}.sl-bg-none{background-image:none}.sl-bg-gradient-to-t{background-image:linear-gradient(to top,var(--tw-gradient-stops))}.sl-bg-gradient-to-tr{background-image:linear-gradient(to top right,var(--tw-gradient-stops))}.sl-bg-gradient-to-r{background-image:linear-gradient(to right,var(--tw-gradient-stops))}.sl-bg-gradient-to-br{background-image:linear-gradient(to bottom right,var(--tw-gradient-stops))}.sl-bg-gradient-to-b{background-image:linear-gradient(to bottom,var(--tw-gradient-stops))}.sl-bg-gradient-to-bl{background-image:linear-gradient(to bottom left,var(--tw-gradient-stops))}.sl-bg-gradient-to-l{background-image:linear-gradient(to left,var(--tw-gradient-stops))}.sl-bg-gradient-to-tl{background-image:linear-gradient(to top 
left,var(--tw-gradient-stops))}.sl-blur-0,.sl-blur-none{--tw-blur:blur(0)}.sl-blur-sm{--tw-blur:blur(4px)}.sl-blur{--tw-blur:blur(8px)}.sl-blur-md{--tw-blur:blur(12px)}.sl-blur-lg{--tw-blur:blur(16px)}.sl-blur-xl{--tw-blur:blur(24px)}.sl-blur-2xl{--tw-blur:blur(40px)}.sl-blur-3xl{--tw-blur:blur(64px)}.sl-border-transparent{border-color:transparent}.sl-border-current{border-color:currentColor}.sl-border-lighten-100{border-color:var(--color-lighten-100)}.sl-border-darken-100{border-color:var(--color-darken-100)}.sl-border-primary{border-color:var(--color-primary)}.sl-border-primary-tint{border-color:var(--color-primary-tint)}.sl-border-primary-light{border-color:var(--color-primary-light)}.sl-border-primary-dark{border-color:var(--color-primary-dark)}.sl-border-primary-darker{border-color:var(--color-primary-darker)}.sl-border-success{border-color:var(--color-success)}.sl-border-success-tint{border-color:var(--color-success-tint)}.sl-border-success-light{border-color:var(--color-success-light)}.sl-border-success-dark{border-color:var(--color-success-dark)}.sl-border-success-darker{border-color:var(--color-success-darker)}.sl-border-warning{border-color:var(--color-warning)}.sl-border-warning-tint{border-color:var(--color-warning-tint)}.sl-border-warning-light{border-color:var(--color-warning-light)}.sl-border-warning-dark{border-color:var(--color-warning-dark)}.sl-border-warning-darker{border-color:var(--color-warning-darker)}.sl-border-danger{border-color:var(--color-danger)}.sl-border-danger-tint{border-color:var(--color-danger-tint)}.sl-border-danger-light{border-color:var(--color-danger-light)}.sl-border-danger-dark{border-color:var(--color-danger-dark)}.sl-border-danger-darker{border-color:var(--color-danger-darker)}.sl-border-code{border-color:var(--color-code)}.sl-border-on-code{border-color:var(--color-on-code)}.sl-border-on-primary{border-color:var(--color-on-primary)}.sl-border-on-success{border-color:var(--color-on-success)}.sl-border-on-warning{border-color:var(--color-on-warning)}.sl-border-on-danger{border-color:var(--color-on-danger)}.sl-border-light{border-color:var(--color-border-light)}.sl-border-dark{border-color:var(--color-border-dark)}.sl-border-button{border-color:var(--color-border-button)}.sl-border-input{border-color:var(--color-border-input)}.sl-border-body{border-color:var(--color-text)}.hover\:sl-border-transparent:hover{border-color:transparent}.hover\:sl-border-current:hover{border-color:currentColor}.hover\:sl-border-lighten-100:hover{border-color:var(--color-lighten-100)}.hover\:sl-border-darken-100:hover{border-color:var(--color-darken-100)}.hover\:sl-border-primary:hover{border-color:var(--color-primary)}.hover\:sl-border-primary-tint:hover{border-color:var(--color-primary-tint)}.hover\:sl-border-primary-light:hover{border-color:var(--color-primary-light)}.hover\:sl-border-primary-dark:hover{border-color:var(--color-primary-dark)}.hover\:sl-border-primary-darker:hover{border-color:var(--color-primary-darker)}.hover\:sl-border-success:hover{border-color:var(--color-success)}.hover\:sl-border-success-tint:hover{border-color:var(--color-success-tint)}.hover\:sl-border-success-light:hover{border-color:var(--color-success-light)}.hover\:sl-border-success-dark:hover{border-color:var(--color-success-dark)}.hover\:sl-border-success-darker:hover{border-color:var(--color-success-darker)}.hover\:sl-border-warning:hover{border-color:var(--color-warning)}.hover\:sl-border-warning-tint:hover{border-color:var(--color-warning-tint)}.hover\:sl-border-warning-light:hover{bord
er-color:var(--color-warning-light)}.hover\:sl-border-warning-dark:hover{border-color:var(--color-warning-dark)}.hover\:sl-border-warning-darker:hover{border-color:var(--color-warning-darker)}.hover\:sl-border-danger:hover{border-color:var(--color-danger)}.hover\:sl-border-danger-tint:hover{border-color:var(--color-danger-tint)}.hover\:sl-border-danger-light:hover{border-color:var(--color-danger-light)}.hover\:sl-border-danger-dark:hover{border-color:var(--color-danger-dark)}.hover\:sl-border-danger-darker:hover{border-color:var(--color-danger-darker)}.hover\:sl-border-code:hover{border-color:var(--color-code)}.hover\:sl-border-on-code:hover{border-color:var(--color-on-code)}.hover\:sl-border-on-primary:hover{border-color:var(--color-on-primary)}.hover\:sl-border-on-success:hover{border-color:var(--color-on-success)}.hover\:sl-border-on-warning:hover{border-color:var(--color-on-warning)}.hover\:sl-border-on-danger:hover{border-color:var(--color-on-danger)}.hover\:sl-border-light:hover{border-color:var(--color-border-light)}.hover\:sl-border-dark:hover{border-color:var(--color-border-dark)}.hover\:sl-border-button:hover{border-color:var(--color-border-button)}.hover\:sl-border-input:hover{border-color:var(--color-border-input)}.hover\:sl-border-body:hover{border-color:var(--color-text)}.focus\:sl-border-transparent:focus{border-color:transparent}.focus\:sl-border-current:focus{border-color:currentColor}.focus\:sl-border-lighten-100:focus{border-color:var(--color-lighten-100)}.focus\:sl-border-darken-100:focus{border-color:var(--color-darken-100)}.focus\:sl-border-primary:focus{border-color:var(--color-primary)}.focus\:sl-border-primary-tint:focus{border-color:var(--color-primary-tint)}.focus\:sl-border-primary-light:focus{border-color:var(--color-primary-light)}.focus\:sl-border-primary-dark:focus{border-color:var(--color-primary-dark)}.focus\:sl-border-primary-darker:focus{border-color:var(--color-primary-darker)}.focus\:sl-border-success:focus{border-color:var(--color-success)}.focus\:sl-border-success-tint:focus{border-color:var(--color-success-tint)}.focus\:sl-border-success-light:focus{border-color:var(--color-success-light)}.focus\:sl-border-success-dark:focus{border-color:var(--color-success-dark)}.focus\:sl-border-success-darker:focus{border-color:var(--color-success-darker)}.focus\:sl-border-warning:focus{border-color:var(--color-warning)}.focus\:sl-border-warning-tint:focus{border-color:var(--color-warning-tint)}.focus\:sl-border-warning-light:focus{border-color:var(--color-warning-light)}.focus\:sl-border-warning-dark:focus{border-color:var(--color-warning-dark)}.focus\:sl-border-warning-darker:focus{border-color:var(--color-warning-darker)}.focus\:sl-border-danger:focus{border-color:var(--color-danger)}.focus\:sl-border-danger-tint:focus{border-color:var(--color-danger-tint)}.focus\:sl-border-danger-light:focus{border-color:var(--color-danger-light)}.focus\:sl-border-danger-dark:focus{border-color:var(--color-danger-dark)}.focus\:sl-border-danger-darker:focus{border-color:var(--color-danger-darker)}.focus\:sl-border-code:focus{border-color:var(--color-code)}.focus\:sl-border-on-code:focus{border-color:var(--color-on-code)}.focus\:sl-border-on-primary:focus{border-color:var(--color-on-primary)}.focus\:sl-border-on-success:focus{border-color:var(--color-on-success)}.focus\:sl-border-on-warning:focus{border-color:var(--color-on-warning)}.focus\:sl-border-on-danger:focus{border-color:var(--color-on-danger)}.focus\:sl-border-light:focus{border-color:var(--color-border-light)}.focus\:s
l-border-dark:focus{border-color:var(--color-border-dark)}.focus\:sl-border-button:focus{border-color:var(--color-border-button)}.focus\:sl-border-input:focus{border-color:var(--color-border-input)}.focus\:sl-border-body:focus{border-color:var(--color-text)}.focus-within\:sl-border-transparent:focus-within{border-color:transparent}.focus-within\:sl-border-current:focus-within{border-color:currentColor}.focus-within\:sl-border-lighten-100:focus-within{border-color:var(--color-lighten-100)}.focus-within\:sl-border-darken-100:focus-within{border-color:var(--color-darken-100)}.focus-within\:sl-border-primary:focus-within{border-color:var(--color-primary)}.focus-within\:sl-border-primary-tint:focus-within{border-color:var(--color-primary-tint)}.focus-within\:sl-border-primary-light:focus-within{border-color:var(--color-primary-light)}.focus-within\:sl-border-primary-dark:focus-within{border-color:var(--color-primary-dark)}.focus-within\:sl-border-primary-darker:focus-within{border-color:var(--color-primary-darker)}.focus-within\:sl-border-success:focus-within{border-color:var(--color-success)}.focus-within\:sl-border-success-tint:focus-within{border-color:var(--color-success-tint)}.focus-within\:sl-border-success-light:focus-within{border-color:var(--color-success-light)}.focus-within\:sl-border-success-dark:focus-within{border-color:var(--color-success-dark)}.focus-within\:sl-border-success-darker:focus-within{border-color:var(--color-success-darker)}.focus-within\:sl-border-warning:focus-within{border-color:var(--color-warning)}.focus-within\:sl-border-warning-tint:focus-within{border-color:var(--color-warning-tint)}.focus-within\:sl-border-warning-light:focus-within{border-color:var(--color-warning-light)}.focus-within\:sl-border-warning-dark:focus-within{border-color:var(--color-warning-dark)}.focus-within\:sl-border-warning-darker:focus-within{border-color:var(--color-warning-darker)}.focus-within\:sl-border-danger:focus-within{border-color:var(--color-danger)}.focus-within\:sl-border-danger-tint:focus-within{border-color:var(--color-danger-tint)}.focus-within\:sl-border-danger-light:focus-within{border-color:var(--color-danger-light)}.focus-within\:sl-border-danger-dark:focus-within{border-color:var(--color-danger-dark)}.focus-within\:sl-border-danger-darker:focus-within{border-color:var(--color-danger-darker)}.focus-within\:sl-border-code:focus-within{border-color:var(--color-code)}.focus-within\:sl-border-on-code:focus-within{border-color:var(--color-on-code)}.focus-within\:sl-border-on-primary:focus-within{border-color:var(--color-on-primary)}.focus-within\:sl-border-on-success:focus-within{border-color:var(--color-on-success)}.focus-within\:sl-border-on-warning:focus-within{border-color:var(--color-on-warning)}.focus-within\:sl-border-on-danger:focus-within{border-color:var(--color-on-danger)}.focus-within\:sl-border-light:focus-within{border-color:var(--color-border-light)}.focus-within\:sl-border-dark:focus-within{border-color:var(--color-border-dark)}.focus-within\:sl-border-button:focus-within{border-color:var(--color-border-button)}.focus-within\:sl-border-input:focus-within{border-color:var(--color-border-input)}.focus-within\:sl-border-body:focus-within{border-color:var(--color-text)}.active\:sl-border-transparent:active{border-color:transparent}.active\:sl-border-current:active{border-color:currentColor}.active\:sl-border-lighten-100:active{border-color:var(--color-lighten-100)}.active\:sl-border-darken-100:active{border-color:var(--color-darken-100)}.active\:sl-border-primary:
active{border-color:var(--color-primary)}.active\:sl-border-primary-tint:active{border-color:var(--color-primary-tint)}.active\:sl-border-primary-light:active{border-color:var(--color-primary-light)}.active\:sl-border-primary-dark:active{border-color:var(--color-primary-dark)}.active\:sl-border-primary-darker:active{border-color:var(--color-primary-darker)}.active\:sl-border-success:active{border-color:var(--color-success)}.active\:sl-border-success-tint:active{border-color:var(--color-success-tint)}.active\:sl-border-success-light:active{border-color:var(--color-success-light)}.active\:sl-border-success-dark:active{border-color:var(--color-success-dark)}.active\:sl-border-success-darker:active{border-color:var(--color-success-darker)}.active\:sl-border-warning:active{border-color:var(--color-warning)}.active\:sl-border-warning-tint:active{border-color:var(--color-warning-tint)}.active\:sl-border-warning-light:active{border-color:var(--color-warning-light)}.active\:sl-border-warning-dark:active{border-color:var(--color-warning-dark)}.active\:sl-border-warning-darker:active{border-color:var(--color-warning-darker)}.active\:sl-border-danger:active{border-color:var(--color-danger)}.active\:sl-border-danger-tint:active{border-color:var(--color-danger-tint)}.active\:sl-border-danger-light:active{border-color:var(--color-danger-light)}.active\:sl-border-danger-dark:active{border-color:var(--color-danger-dark)}.active\:sl-border-danger-darker:active{border-color:var(--color-danger-darker)}.active\:sl-border-code:active{border-color:var(--color-code)}.active\:sl-border-on-code:active{border-color:var(--color-on-code)}.active\:sl-border-on-primary:active{border-color:var(--color-on-primary)}.active\:sl-border-on-success:active{border-color:var(--color-on-success)}.active\:sl-border-on-warning:active{border-color:var(--color-on-warning)}.active\:sl-border-on-danger:active{border-color:var(--color-on-danger)}.active\:sl-border-light:active{border-color:var(--color-border-light)}.active\:sl-border-dark:active{border-color:var(--color-border-dark)}.active\:sl-border-button:active{border-color:var(--color-border-button)}.active\:sl-border-input:active{border-color:var(--color-border-input)}.active\:sl-border-body:active{border-color:var(--color-text)}.sl-rounded-none{border-radius:0}.sl-rounded-sm{border-radius:1px}.sl-rounded{border-radius:2px}.sl-rounded-lg{border-radius:5px}.sl-rounded-xl{border-radius:7px}.sl-rounded-full{border-radius:9999px}.sl-rounded-t-none{border-top-left-radius:0;border-top-right-radius:0}.sl-rounded-r-none{border-bottom-right-radius:0;border-top-right-radius:0}.sl-rounded-b-none{border-bottom-left-radius:0;border-bottom-right-radius:0}.sl-rounded-l-none{border-bottom-left-radius:0;border-top-left-radius:0}.sl-rounded-t-sm{border-top-left-radius:1px;border-top-right-radius:1px}.sl-rounded-r-sm{border-bottom-right-radius:1px;border-top-right-radius:1px}.sl-rounded-b-sm{border-bottom-left-radius:1px;border-bottom-right-radius:1px}.sl-rounded-l-sm{border-bottom-left-radius:1px;border-top-left-radius:1px}.sl-rounded-t{border-top-left-radius:2px}.sl-rounded-r,.sl-rounded-t{border-top-right-radius:2px}.sl-rounded-b,.sl-rounded-r{border-bottom-right-radius:2px}.sl-rounded-b,.sl-rounded-l{border-bottom-left-radius:2px}.sl-rounded-l{border-top-left-radius:2px}.sl-rounded-t-lg{border-top-left-radius:5px;border-top-right-radius:5px}.sl-rounded-r-lg{border-bottom-right-radius:5px;border-top-right-radius:5px}.sl-rounded-b-lg{border-bottom-left-radius:5px;border-bottom-right-radius:5px}.sl-r
ounded-l-lg{border-bottom-left-radius:5px;border-top-left-radius:5px}.sl-rounded-t-xl{border-top-left-radius:7px;border-top-right-radius:7px}.sl-rounded-r-xl{border-bottom-right-radius:7px;border-top-right-radius:7px}.sl-rounded-b-xl{border-bottom-left-radius:7px;border-bottom-right-radius:7px}.sl-rounded-l-xl{border-bottom-left-radius:7px;border-top-left-radius:7px}.sl-rounded-t-full{border-top-left-radius:9999px;border-top-right-radius:9999px}.sl-rounded-r-full{border-bottom-right-radius:9999px;border-top-right-radius:9999px}.sl-rounded-b-full{border-bottom-left-radius:9999px;border-bottom-right-radius:9999px}.sl-rounded-l-full{border-bottom-left-radius:9999px;border-top-left-radius:9999px}.sl-rounded-tl-none{border-top-left-radius:0}.sl-rounded-tr-none{border-top-right-radius:0}.sl-rounded-br-none{border-bottom-right-radius:0}.sl-rounded-bl-none{border-bottom-left-radius:0}.sl-rounded-tl-sm{border-top-left-radius:1px}.sl-rounded-tr-sm{border-top-right-radius:1px}.sl-rounded-br-sm{border-bottom-right-radius:1px}.sl-rounded-bl-sm{border-bottom-left-radius:1px}.sl-rounded-tl{border-top-left-radius:2px}.sl-rounded-tr{border-top-right-radius:2px}.sl-rounded-br{border-bottom-right-radius:2px}.sl-rounded-bl{border-bottom-left-radius:2px}.sl-rounded-tl-lg{border-top-left-radius:5px}.sl-rounded-tr-lg{border-top-right-radius:5px}.sl-rounded-br-lg{border-bottom-right-radius:5px}.sl-rounded-bl-lg{border-bottom-left-radius:5px}.sl-rounded-tl-xl{border-top-left-radius:7px}.sl-rounded-tr-xl{border-top-right-radius:7px}.sl-rounded-br-xl{border-bottom-right-radius:7px}.sl-rounded-bl-xl{border-bottom-left-radius:7px}.sl-rounded-tl-full{border-top-left-radius:9999px}.sl-rounded-tr-full{border-top-right-radius:9999px}.sl-rounded-br-full{border-bottom-right-radius:9999px}.sl-rounded-bl-full{border-bottom-left-radius:9999px}.sl-border-solid{border-style:solid}.sl-border-dashed{border-style:dashed}.sl-border-dotted{border-style:dotted}.sl-border-double{border-style:double}.sl-border-none{border-style:none}.sl-border-0{border-width:0}.sl-border-2{border-width:2px}.sl-border-4{border-width:4px}.sl-border-8{border-width:8px}.sl-border{border-width:1px}.sl-border-t-0{border-top-width:0}.sl-border-r-0{border-right-width:0}.sl-border-b-0{border-bottom-width:0}.sl-border-l-0{border-left-width:0}.sl-border-t-2{border-top-width:2px}.sl-border-r-2{border-right-width:2px}.sl-border-b-2{border-bottom-width:2px}.sl-border-l-2{border-left-width:2px}.sl-border-t-4{border-top-width:4px}.sl-border-r-4{border-right-width:4px}.sl-border-b-4{border-bottom-width:4px}.sl-border-l-4{border-left-width:4px}.sl-border-t-8{border-top-width:8px}.sl-border-r-8{border-right-width:8px}.sl-border-b-8{border-bottom-width:8px}.sl-border-l-8{border-left-width:8px}.sl-border-t{border-top-width:1px}.sl-border-r{border-right-width:1px}.sl-border-b{border-bottom-width:1px}.sl-border-l{border-left-width:1px}*{--tw-shadow:0 0 #0000}.sl-shadow-sm{--tw-shadow:var(--shadow-sm);box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.sl-shadow,.sl-shadow-md{--tw-shadow:var(--shadow-md)}.sl-shadow,.sl-shadow-lg,.sl-shadow-md{box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.sl-shadow-lg{--tw-shadow:var(--shadow-lg)}.sl-shadow-xl{--tw-shadow:var(--shadow-xl)}.sl-shadow-2xl,.sl-shadow-xl{box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 
#0000),var(--tw-shadow)}.sl-shadow-2xl{--tw-shadow:var(--shadow-2xl)}.hover\:sl-shadow-sm:hover{--tw-shadow:var(--shadow-sm);box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.hover\:sl-shadow-md:hover,.hover\:sl-shadow:hover{--tw-shadow:var(--shadow-md)}.hover\:sl-shadow-lg:hover,.hover\:sl-shadow-md:hover,.hover\:sl-shadow:hover{box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.hover\:sl-shadow-lg:hover{--tw-shadow:var(--shadow-lg)}.hover\:sl-shadow-xl:hover{--tw-shadow:var(--shadow-xl)}.hover\:sl-shadow-2xl:hover,.hover\:sl-shadow-xl:hover{box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.hover\:sl-shadow-2xl:hover{--tw-shadow:var(--shadow-2xl)}.focus\:sl-shadow-sm:focus{--tw-shadow:var(--shadow-sm);box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.focus\:sl-shadow-md:focus,.focus\:sl-shadow:focus{--tw-shadow:var(--shadow-md)}.focus\:sl-shadow-lg:focus,.focus\:sl-shadow-md:focus,.focus\:sl-shadow:focus{box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.focus\:sl-shadow-lg:focus{--tw-shadow:var(--shadow-lg)}.focus\:sl-shadow-xl:focus{--tw-shadow:var(--shadow-xl)}.focus\:sl-shadow-2xl:focus,.focus\:sl-shadow-xl:focus{box-shadow:var(--tw-ring-offset-shadow,0 0 #0000),var(--tw-ring-shadow,0 0 #0000),var(--tw-shadow)}.focus\:sl-shadow-2xl:focus{--tw-shadow:var(--shadow-2xl)}.sl-box-border{box-sizing:border-box}.sl-box-content{box-sizing:content-box}.sl-cursor-auto{cursor:auto}.sl-cursor{cursor:default}.sl-cursor-pointer{cursor:pointer}.sl-cursor-wait{cursor:wait}.sl-cursor-text{cursor:text}.sl-cursor-move{cursor:move}.sl-cursor-not-allowed{cursor:not-allowed}.sl-cursor-zoom-in{cursor:zoom-in}.sl-cursor-zoom-out{cursor:zoom-out}.sl-block{display:block}.sl-inline-block{display:inline-block}.sl-inline{display:inline}.sl-flex{display:flex}.sl-inline-flex{display:inline-flex}.sl-table{display:table}.sl-inline-table{display:inline-table}.sl-table-caption{display:table-caption}.sl-table-cell{display:table-cell}.sl-table-column{display:table-column}.sl-table-column-group{display:table-column-group}.sl-table-footer-group{display:table-footer-group}.sl-table-header-group{display:table-header-group}.sl-table-row-group{display:table-row-group}.sl-table-row{display:table-row}.sl-flow-root{display:flow-root}.sl-grid{display:grid}.sl-inline-grid{display:inline-grid}.sl-contents{display:contents}.sl-list-item{display:list-item}.sl-hidden{display:none}.sl-drop-shadow{--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2))}.sl-filter{--tw-blur:var(--tw-empty,/*!*/ /*!*/);--tw-brightness:var(--tw-empty,/*!*/ /*!*/);--tw-contrast:var(--tw-empty,/*!*/ /*!*/);--tw-grayscale:var(--tw-empty,/*!*/ /*!*/);--tw-hue-rotate:var(--tw-empty,/*!*/ /*!*/);--tw-invert:var(--tw-empty,/*!*/ /*!*/);--tw-saturate:var(--tw-empty,/*!*/ /*!*/);--tw-sepia:var(--tw-empty,/*!*/ /*!*/);--tw-drop-shadow:var(--tw-empty,/*!*/ /*!*/);filter:var(--tw-blur) var(--tw-brightness) var(--tw-contrast) var(--tw-grayscale) var(--tw-hue-rotate) var(--tw-invert) var(--tw-saturate) var(--tw-sepia) var(--tw-drop-shadow)}.sl-filter-none{filter:none}.sl-flex-1{flex:1 1}.sl-flex-auto{flex:1 1 auto}.sl-flex-initial{flex:0 1 
auto}.sl-flex-none{flex:none}.sl-flex-row{flex-direction:row}.sl-flex-row-reverse{flex-direction:row-reverse}.sl-flex-col{flex-direction:column}.sl-flex-col-reverse{flex-direction:column-reverse}.sl-flex-grow-0{flex-grow:0}.sl-flex-grow{flex-grow:1}.sl-flex-shrink-0{flex-shrink:0}.sl-flex-shrink{flex-shrink:1}.sl-flex-wrap{flex-wrap:wrap}.sl-flex-wrap-reverse{flex-wrap:wrap-reverse}.sl-flex-nowrap{flex-wrap:nowrap}.sl-font-sans,.sl-font-ui{font-family:var(--font-ui)}.sl-font-prose{font-family:var(--font-prose)}.sl-font-mono{font-family:var(--font-mono)}.sl-text-2xs{font-size:9px}.sl-text-xs{font-size:10px}.sl-text-sm{font-size:11px}.sl-text-base{font-size:12px}.sl-text-lg{font-size:14px}.sl-text-xl{font-size:16px}.sl-text-2xl{font-size:20px}.sl-text-3xl{font-size:24px}.sl-text-4xl{font-size:28px}.sl-text-5xl{font-size:36px}.sl-text-6xl{font-size:44px}.sl-text-paragraph-leading{font-size:var(--fs-paragraph-leading)}.sl-text-paragraph{font-size:var(--fs-paragraph)}.sl-text-paragraph-small{font-size:var(--fs-paragraph-small)}.sl-text-paragraph-tiny{font-size:var(--fs-paragraph-tiny)}.sl-antialiased{-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.sl-subpixel-antialiased{-webkit-font-smoothing:auto;-moz-osx-font-smoothing:auto}.sl-italic{font-style:italic}.sl-not-italic{font-style:normal}.sl-font-light{font-weight:300}.sl-font-normal{font-weight:400}.sl-font-medium{font-weight:500}.sl-font-semibold{font-weight:600}.sl-font-bold{font-weight:700}.sl-h-0{height:0}.sl-h-1{height:4px}.sl-h-2{height:8px}.sl-h-3{height:12px}.sl-h-4{height:16px}.sl-h-5{height:20px}.sl-h-6{height:24px}.sl-h-7{height:28px}.sl-h-8{height:32px}.sl-h-9{height:36px}.sl-h-10{height:40px}.sl-h-11{height:44px}.sl-h-12{height:48px}.sl-h-14{height:56px}.sl-h-16{height:64px}.sl-h-20{height:80px}.sl-h-24{height:96px}.sl-h-28{height:112px}.sl-h-32{height:128px}.sl-h-36{height:144px}.sl-h-40{height:160px}.sl-h-44{height:176px}.sl-h-48{height:192px}.sl-h-52{height:208px}.sl-h-56{height:224px}.sl-h-60{height:240px}.sl-h-64{height:256px}.sl-h-72{height:288px}.sl-h-80{height:320px}.sl-h-96{height:384px}.sl-h-auto{height:auto}.sl-h-px{height:1px}.sl-h-0\.5{height:2px}.sl-h-1\.5{height:6px}.sl-h-2\.5{height:10px}.sl-h-3\.5{height:14px}.sl-h-4\.5{height:18px}.sl-h-xs{height:20px}.sl-h-sm{height:24px}.sl-h-md{height:32px}.sl-h-lg{height:36px}.sl-h-xl{height:44px}.sl-h-2xl{height:52px}.sl-h-3xl{height:60px}.sl-h-full{height:100%}.sl-h-screen{height:100vh}.sl-inset-0{bottom:0;left:0;right:0;top:0}.sl-inset-1{bottom:4px;left:4px;right:4px;top:4px}.sl-inset-2{bottom:8px;left:8px;right:8px;top:8px}.sl-inset-3{bottom:12px;left:12px;right:12px;top:12px}.sl-inset-4{bottom:16px;left:16px;right:16px;top:16px}.sl-inset-5{bottom:20px;left:20px;right:20px;top:20px}.sl-inset-6{bottom:24px;left:24px;right:24px;top:24px}.sl-inset-7{bottom:28px;left:28px;right:28px;top:28px}.sl-inset-8{bottom:32px;left:32px;right:32px;top:32px}.sl-inset-9{bottom:36px;left:36px;right:36px;top:36px}.sl-inset-10{bottom:40px;left:40px;right:40px;top:40px}.sl-inset-11{bottom:44px;left:44px;right:44px;top:44px}.sl-inset-12{bottom:48px;left:48px;right:48px;top:48px}.sl-inset-14{bottom:56px;left:56px;right:56px;top:56px}.sl-inset-16{bottom:64px;left:64px;right:64px;top:64px}.sl-inset-20{bottom:80px;left:80px;right:80px;top:80px}.sl-inset-24{bottom:96px;left:96px;right:96px;top:96px}.sl-inset-28{bottom:112px;left:112px;right:112px;top:112px}.sl-inset-32{bottom:128px;left:128px;right:128px;top:128px}.sl-inset-36{bottom:144px;left:144px;right:144px;top:144px}.sl-in
set-40{bottom:160px;left:160px;right:160px;top:160px}.sl-inset-44{bottom:176px;left:176px;right:176px;top:176px}.sl-inset-48{bottom:192px;left:192px;right:192px;top:192px}.sl-inset-52{bottom:208px;left:208px;right:208px;top:208px}.sl-inset-56{bottom:224px;left:224px;right:224px;top:224px}.sl-inset-60{bottom:240px;left:240px;right:240px;top:240px}.sl-inset-64{bottom:256px;left:256px;right:256px;top:256px}.sl-inset-72{bottom:288px;left:288px;right:288px;top:288px}.sl-inset-80{bottom:320px;left:320px;right:320px;top:320px}.sl-inset-96{bottom:384px;left:384px;right:384px;top:384px}.sl-inset-auto{bottom:auto;left:auto;right:auto;top:auto}.sl-inset-px{bottom:1px;left:1px;right:1px;top:1px}.sl-inset-0\.5{bottom:2px;left:2px;right:2px;top:2px}.sl-inset-1\.5{bottom:6px;left:6px;right:6px;top:6px}.sl-inset-2\.5{bottom:10px;left:10px;right:10px;top:10px}.sl-inset-3\.5{bottom:14px;left:14px;right:14px;top:14px}.sl-inset-4\.5{bottom:18px;left:18px;right:18px;top:18px}.sl--inset-0{bottom:0;left:0;right:0;top:0}.sl--inset-1{bottom:-4px;left:-4px;right:-4px;top:-4px}.sl--inset-2{bottom:-8px;left:-8px;right:-8px;top:-8px}.sl--inset-3{bottom:-12px;left:-12px;right:-12px;top:-12px}.sl--inset-4{bottom:-16px;left:-16px;right:-16px;top:-16px}.sl--inset-5{bottom:-20px;left:-20px;right:-20px;top:-20px}.sl--inset-6{bottom:-24px;left:-24px;right:-24px;top:-24px}.sl--inset-7{bottom:-28px;left:-28px;right:-28px;top:-28px}.sl--inset-8{bottom:-32px;left:-32px;right:-32px;top:-32px}.sl--inset-9{bottom:-36px;left:-36px;right:-36px;top:-36px}.sl--inset-10{bottom:-40px;left:-40px;right:-40px;top:-40px}.sl--inset-11{bottom:-44px;left:-44px;right:-44px;top:-44px}.sl--inset-12{bottom:-48px;left:-48px;right:-48px;top:-48px}.sl--inset-14{bottom:-56px;left:-56px;right:-56px;top:-56px}.sl--inset-16{bottom:-64px;left:-64px;right:-64px;top:-64px}.sl--inset-20{bottom:-80px;left:-80px;right:-80px;top:-80px}.sl--inset-24{bottom:-96px;left:-96px;right:-96px;top:-96px}.sl--inset-28{bottom:-112px;left:-112px;right:-112px;top:-112px}.sl--inset-32{bottom:-128px;left:-128px;right:-128px;top:-128px}.sl--inset-36{bottom:-144px;left:-144px;right:-144px;top:-144px}.sl--inset-40{bottom:-160px;left:-160px;right:-160px;top:-160px}.sl--inset-44{bottom:-176px;left:-176px;right:-176px;top:-176px}.sl--inset-48{bottom:-192px;left:-192px;right:-192px;top:-192px}.sl--inset-52{bottom:-208px;left:-208px;right:-208px;top:-208px}.sl--inset-56{bottom:-224px;left:-224px;right:-224px;top:-224px}.sl--inset-60{bottom:-240px;left:-240px;right:-240px;top:-240px}.sl--inset-64{bottom:-256px;left:-256px;right:-256px;top:-256px}.sl--inset-72{bottom:-288px;left:-288px;right:-288px;top:-288px}.sl--inset-80{bottom:-320px;left:-320px;right:-320px;top:-320px}.sl--inset-96{bottom:-384px;left:-384px;right:-384px;top:-384px}.sl--inset-px{bottom:-1px;left:-1px;right:-1px;top:-1px}.sl--inset-0\.5{bottom:-2px;left:-2px;right:-2px;top:-2px}.sl--inset-1\.5{bottom:-6px;left:-6px;right:-6px;top:-6px}.sl--inset-2\.5{bottom:-10px;left:-10px;right:-10px;top:-10px}.sl--inset-3\.5{bottom:-14px;left:-14px;right:-14px;top:-14px}.sl--inset-4\.5{bottom:-18px;left:-18px;right:-18px;top:-18px}.sl-inset-y-0{bottom:0;top:0}.sl-inset-x-0{left:0;right:0}.sl-inset-y-1{bottom:4px;top:4px}.sl-inset-x-1{left:4px;right:4px}.sl-inset-y-2{bottom:8px;top:8px}.sl-inset-x-2{left:8px;right:8px}.sl-inset-y-3{bottom:12px;top:12px}.sl-inset-x-3{left:12px;right:12px}.sl-inset-y-4{bottom:16px;top:16px}.sl-inset-x-4{left:16px;right:16px}.sl-inset-y-5{bottom:20px;top:20px}.sl-inset-x-5{left:20px;right:20px}.sl-inset-
y-6{bottom:24px;top:24px}.sl-inset-x-6{left:24px;right:24px}.sl-inset-y-7{bottom:28px;top:28px}.sl-inset-x-7{left:28px;right:28px}.sl-inset-y-8{bottom:32px;top:32px}.sl-inset-x-8{left:32px;right:32px}.sl-inset-y-9{bottom:36px;top:36px}.sl-inset-x-9{left:36px;right:36px}.sl-inset-y-10{bottom:40px;top:40px}.sl-inset-x-10{left:40px;right:40px}.sl-inset-y-11{bottom:44px;top:44px}.sl-inset-x-11{left:44px;right:44px}.sl-inset-y-12{bottom:48px;top:48px}.sl-inset-x-12{left:48px;right:48px}.sl-inset-y-14{bottom:56px;top:56px}.sl-inset-x-14{left:56px;right:56px}.sl-inset-y-16{bottom:64px;top:64px}.sl-inset-x-16{left:64px;right:64px}.sl-inset-y-20{bottom:80px;top:80px}.sl-inset-x-20{left:80px;right:80px}.sl-inset-y-24{bottom:96px;top:96px}.sl-inset-x-24{left:96px;right:96px}.sl-inset-y-28{bottom:112px;top:112px}.sl-inset-x-28{left:112px;right:112px}.sl-inset-y-32{bottom:128px;top:128px}.sl-inset-x-32{left:128px;right:128px}.sl-inset-y-36{bottom:144px;top:144px}.sl-inset-x-36{left:144px;right:144px}.sl-inset-y-40{bottom:160px;top:160px}.sl-inset-x-40{left:160px;right:160px}.sl-inset-y-44{bottom:176px;top:176px}.sl-inset-x-44{left:176px;right:176px}.sl-inset-y-48{bottom:192px;top:192px}.sl-inset-x-48{left:192px;right:192px}.sl-inset-y-52{bottom:208px;top:208px}.sl-inset-x-52{left:208px;right:208px}.sl-inset-y-56{bottom:224px;top:224px}.sl-inset-x-56{left:224px;right:224px}.sl-inset-y-60{bottom:240px;top:240px}.sl-inset-x-60{left:240px;right:240px}.sl-inset-y-64{bottom:256px;top:256px}.sl-inset-x-64{left:256px;right:256px}.sl-inset-y-72{bottom:288px;top:288px}.sl-inset-x-72{left:288px;right:288px}.sl-inset-y-80{bottom:320px;top:320px}.sl-inset-x-80{left:320px;right:320px}.sl-inset-y-96{bottom:384px;top:384px}.sl-inset-x-96{left:384px;right:384px}.sl-inset-y-auto{bottom:auto;top:auto}.sl-inset-x-auto{left:auto;right:auto}.sl-inset-y-px{bottom:1px;top:1px}.sl-inset-x-px{left:1px;right:1px}.sl-inset-y-0\.5{bottom:2px;top:2px}.sl-inset-x-0\.5{left:2px;right:2px}.sl-inset-y-1\.5{bottom:6px;top:6px}.sl-inset-x-1\.5{left:6px;right:6px}.sl-inset-y-2\.5{bottom:10px;top:10px}.sl-inset-x-2\.5{left:10px;right:10px}.sl-inset-y-3\.5{bottom:14px;top:14px}.sl-inset-x-3\.5{left:14px;right:14px}.sl-inset-y-4\.5{bottom:18px;top:18px}.sl-inset-x-4\.5{left:18px;right:18px}.sl--inset-y-0{bottom:0;top:0}.sl--inset-x-0{left:0;right:0}.sl--inset-y-1{bottom:-4px;top:-4px}.sl--inset-x-1{left:-4px;right:-4px}.sl--inset-y-2{bottom:-8px;top:-8px}.sl--inset-x-2{left:-8px;right:-8px}.sl--inset-y-3{bottom:-12px;top:-12px}.sl--inset-x-3{left:-12px;right:-12px}.sl--inset-y-4{bottom:-16px;top:-16px}.sl--inset-x-4{left:-16px;right:-16px}.sl--inset-y-5{bottom:-20px;top:-20px}.sl--inset-x-5{left:-20px;right:-20px}.sl--inset-y-6{bottom:-24px;top:-24px}.sl--inset-x-6{left:-24px;right:-24px}.sl--inset-y-7{bottom:-28px;top:-28px}.sl--inset-x-7{left:-28px;right:-28px}.sl--inset-y-8{bottom:-32px;top:-32px}.sl--inset-x-8{left:-32px;right:-32px}.sl--inset-y-9{bottom:-36px;top:-36px}.sl--inset-x-9{left:-36px;right:-36px}.sl--inset-y-10{bottom:-40px;top:-40px}.sl--inset-x-10{left:-40px;right:-40px}.sl--inset-y-11{bottom:-44px;top:-44px}.sl--inset-x-11{left:-44px;right:-44px}.sl--inset-y-12{bottom:-48px;top:-48px}.sl--inset-x-12{left:-48px;right:-48px}.sl--inset-y-14{bottom:-56px;top:-56px}.sl--inset-x-14{left:-56px;right:-56px}.sl--inset-y-16{bottom:-64px;top:-64px}.sl--inset-x-16{left:-64px;right:-64px}.sl--inset-y-20{bottom:-80px;top:-80px}.sl--inset-x-20{left:-80px;right:-80px}.sl--inset-y-24{bottom:-96px;top:-96px}.sl--inset-x-24{left:-96px;right:-
96px}.sl--inset-y-28{bottom:-112px;top:-112px}.sl--inset-x-28{left:-112px;right:-112px}.sl--inset-y-32{bottom:-128px;top:-128px}.sl--inset-x-32{left:-128px;right:-128px}.sl--inset-y-36{bottom:-144px;top:-144px}.sl--inset-x-36{left:-144px;right:-144px}.sl--inset-y-40{bottom:-160px;top:-160px}.sl--inset-x-40{left:-160px;right:-160px}.sl--inset-y-44{bottom:-176px;top:-176px}.sl--inset-x-44{left:-176px;right:-176px}.sl--inset-y-48{bottom:-192px;top:-192px}.sl--inset-x-48{left:-192px;right:-192px}.sl--inset-y-52{bottom:-208px;top:-208px}.sl--inset-x-52{left:-208px;right:-208px}.sl--inset-y-56{bottom:-224px;top:-224px}.sl--inset-x-56{left:-224px;right:-224px}.sl--inset-y-60{bottom:-240px;top:-240px}.sl--inset-x-60{left:-240px;right:-240px}.sl--inset-y-64{bottom:-256px;top:-256px}.sl--inset-x-64{left:-256px;right:-256px}.sl--inset-y-72{bottom:-288px;top:-288px}.sl--inset-x-72{left:-288px;right:-288px}.sl--inset-y-80{bottom:-320px;top:-320px}.sl--inset-x-80{left:-320px;right:-320px}.sl--inset-y-96{bottom:-384px;top:-384px}.sl--inset-x-96{left:-384px;right:-384px}.sl--inset-y-px{bottom:-1px;top:-1px}.sl--inset-x-px{left:-1px;right:-1px}.sl--inset-y-0\.5{bottom:-2px;top:-2px}.sl--inset-x-0\.5{left:-2px;right:-2px}.sl--inset-y-1\.5{bottom:-6px;top:-6px}.sl--inset-x-1\.5{left:-6px;right:-6px}.sl--inset-y-2\.5{bottom:-10px;top:-10px}.sl--inset-x-2\.5{left:-10px;right:-10px}.sl--inset-y-3\.5{bottom:-14px;top:-14px}.sl--inset-x-3\.5{left:-14px;right:-14px}.sl--inset-y-4\.5{bottom:-18px;top:-18px}.sl--inset-x-4\.5{left:-18px;right:-18px}.sl-top-0{top:0}.sl-right-0{right:0}.sl-bottom-0{bottom:0}.sl-left-0{left:0}.sl-top-1{top:4px}.sl-right-1{right:4px}.sl-bottom-1{bottom:4px}.sl-left-1{left:4px}.sl-top-2{top:8px}.sl-right-2{right:8px}.sl-bottom-2{bottom:8px}.sl-left-2{left:8px}.sl-top-3{top:12px}.sl-right-3{right:12px}.sl-bottom-3{bottom:12px}.sl-left-3{left:12px}.sl-top-4{top:16px}.sl-right-4{right:16px}.sl-bottom-4{bottom:16px}.sl-left-4{left:16px}.sl-top-5{top:20px}.sl-right-5{right:20px}.sl-bottom-5{bottom:20px}.sl-left-5{left:20px}.sl-top-6{top:24px}.sl-right-6{right:24px}.sl-bottom-6{bottom:24px}.sl-left-6{left:24px}.sl-top-7{top:28px}.sl-right-7{right:28px}.sl-bottom-7{bottom:28px}.sl-left-7{left:28px}.sl-top-8{top:32px}.sl-right-8{right:32px}.sl-bottom-8{bottom:32px}.sl-left-8{left:32px}.sl-top-9{top:36px}.sl-right-9{right:36px}.sl-bottom-9{bottom:36px}.sl-left-9{left:36px}.sl-top-10{top:40px}.sl-right-10{right:40px}.sl-bottom-10{bottom:40px}.sl-left-10{left:40px}.sl-top-11{top:44px}.sl-right-11{right:44px}.sl-bottom-11{bottom:44px}.sl-left-11{left:44px}.sl-top-12{top:48px}.sl-right-12{right:48px}.sl-bottom-12{bottom:48px}.sl-left-12{left:48px}.sl-top-14{top:56px}.sl-right-14{right:56px}.sl-bottom-14{bottom:56px}.sl-left-14{left:56px}.sl-top-16{top:64px}.sl-right-16{right:64px}.sl-bottom-16{bottom:64px}.sl-left-16{left:64px}.sl-top-20{top:80px}.sl-right-20{right:80px}.sl-bottom-20{bottom:80px}.sl-left-20{left:80px}.sl-top-24{top:96px}.sl-right-24{right:96px}.sl-bottom-24{bottom:96px}.sl-left-24{left:96px}.sl-top-28{top:112px}.sl-right-28{right:112px}.sl-bottom-28{bottom:112px}.sl-left-28{left:112px}.sl-top-32{top:128px}.sl-right-32{right:128px}.sl-bottom-32{bottom:128px}.sl-left-32{left:128px}.sl-top-36{top:144px}.sl-right-36{right:144px}.sl-bottom-36{bottom:144px}.sl-left-36{left:144px}.sl-top-40{top:160px}.sl-right-40{right:160px}.sl-bottom-40{bottom:160px}.sl-left-40{left:160px}.sl-top-44{top:176px}.sl-right-44{right:176px}.sl-bottom-44{bottom:176px}.sl-left-44{left:176px}.sl-top-48{top:192px}.sl
-right-48{right:192px}.sl-bottom-48{bottom:192px}.sl-left-48{left:192px}.sl-top-52{top:208px}.sl-right-52{right:208px}.sl-bottom-52{bottom:208px}.sl-left-52{left:208px}.sl-top-56{top:224px}.sl-right-56{right:224px}.sl-bottom-56{bottom:224px}.sl-left-56{left:224px}.sl-top-60{top:240px}.sl-right-60{right:240px}.sl-bottom-60{bottom:240px}.sl-left-60{left:240px}.sl-top-64{top:256px}.sl-right-64{right:256px}.sl-bottom-64{bottom:256px}.sl-left-64{left:256px}.sl-top-72{top:288px}.sl-right-72{right:288px}.sl-bottom-72{bottom:288px}.sl-left-72{left:288px}.sl-top-80{top:320px}.sl-right-80{right:320px}.sl-bottom-80{bottom:320px}.sl-left-80{left:320px}.sl-top-96{top:384px}.sl-right-96{right:384px}.sl-bottom-96{bottom:384px}.sl-left-96{left:384px}.sl-top-auto{top:auto}.sl-right-auto{right:auto}.sl-bottom-auto{bottom:auto}.sl-left-auto{left:auto}.sl-top-px{top:1px}.sl-right-px{right:1px}.sl-bottom-px{bottom:1px}.sl-left-px{left:1px}.sl-top-0\.5{top:2px}.sl-right-0\.5{right:2px}.sl-bottom-0\.5{bottom:2px}.sl-left-0\.5{left:2px}.sl-top-1\.5{top:6px}.sl-right-1\.5{right:6px}.sl-bottom-1\.5{bottom:6px}.sl-left-1\.5{left:6px}.sl-top-2\.5{top:10px}.sl-right-2\.5{right:10px}.sl-bottom-2\.5{bottom:10px}.sl-left-2\.5{left:10px}.sl-top-3\.5{top:14px}.sl-right-3\.5{right:14px}.sl-bottom-3\.5{bottom:14px}.sl-left-3\.5{left:14px}.sl-top-4\.5{top:18px}.sl-right-4\.5{right:18px}.sl-bottom-4\.5{bottom:18px}.sl-left-4\.5{left:18px}.sl--top-0{top:0}.sl--right-0{right:0}.sl--bottom-0{bottom:0}.sl--left-0{left:0}.sl--top-1{top:-4px}.sl--right-1{right:-4px}.sl--bottom-1{bottom:-4px}.sl--left-1{left:-4px}.sl--top-2{top:-8px}.sl--right-2{right:-8px}.sl--bottom-2{bottom:-8px}.sl--left-2{left:-8px}.sl--top-3{top:-12px}.sl--right-3{right:-12px}.sl--bottom-3{bottom:-12px}.sl--left-3{left:-12px}.sl--top-4{top:-16px}.sl--right-4{right:-16px}.sl--bottom-4{bottom:-16px}.sl--left-4{left:-16px}.sl--top-5{top:-20px}.sl--right-5{right:-20px}.sl--bottom-5{bottom:-20px}.sl--left-5{left:-20px}.sl--top-6{top:-24px}.sl--right-6{right:-24px}.sl--bottom-6{bottom:-24px}.sl--left-6{left:-24px}.sl--top-7{top:-28px}.sl--right-7{right:-28px}.sl--bottom-7{bottom:-28px}.sl--left-7{left:-28px}.sl--top-8{top:-32px}.sl--right-8{right:-32px}.sl--bottom-8{bottom:-32px}.sl--left-8{left:-32px}.sl--top-9{top:-36px}.sl--right-9{right:-36px}.sl--bottom-9{bottom:-36px}.sl--left-9{left:-36px}.sl--top-10{top:-40px}.sl--right-10{right:-40px}.sl--bottom-10{bottom:-40px}.sl--left-10{left:-40px}.sl--top-11{top:-44px}.sl--right-11{right:-44px}.sl--bottom-11{bottom:-44px}.sl--left-11{left:-44px}.sl--top-12{top:-48px}.sl--right-12{right:-48px}.sl--bottom-12{bottom:-48px}.sl--left-12{left:-48px}.sl--top-14{top:-56px}.sl--right-14{right:-56px}.sl--bottom-14{bottom:-56px}.sl--left-14{left:-56px}.sl--top-16{top:-64px}.sl--right-16{right:-64px}.sl--bottom-16{bottom:-64px}.sl--left-16{left:-64px}.sl--top-20{top:-80px}.sl--right-20{right:-80px}.sl--bottom-20{bottom:-80px}.sl--left-20{left:-80px}.sl--top-24{top:-96px}.sl--right-24{right:-96px}.sl--bottom-24{bottom:-96px}.sl--left-24{left:-96px}.sl--top-28{top:-112px}.sl--right-28{right:-112px}.sl--bottom-28{bottom:-112px}.sl--left-28{left:-112px}.sl--top-32{top:-128px}.sl--right-32{right:-128px}.sl--bottom-32{bottom:-128px}.sl--left-32{left:-128px}.sl--top-36{top:-144px}.sl--right-36{right:-144px}.sl--bottom-36{bottom:-144px}.sl--left-36{left:-144px}.sl--top-40{top:-160px}.sl--right-40{right:-160px}.sl--bottom-40{bottom:-160px}.sl--left-40{left:-160px}.sl--top-44{top:-176px}.sl--right-44{right:-176px}.sl--bottom-44{bottom:-176px}
.sl--left-44{left:-176px}.sl--top-48{top:-192px}.sl--right-48{right:-192px}.sl--bottom-48{bottom:-192px}.sl--left-48{left:-192px}.sl--top-52{top:-208px}.sl--right-52{right:-208px}.sl--bottom-52{bottom:-208px}.sl--left-52{left:-208px}.sl--top-56{top:-224px}.sl--right-56{right:-224px}.sl--bottom-56{bottom:-224px}.sl--left-56{left:-224px}.sl--top-60{top:-240px}.sl--right-60{right:-240px}.sl--bottom-60{bottom:-240px}.sl--left-60{left:-240px}.sl--top-64{top:-256px}.sl--right-64{right:-256px}.sl--bottom-64{bottom:-256px}.sl--left-64{left:-256px}.sl--top-72{top:-288px}.sl--right-72{right:-288px}.sl--bottom-72{bottom:-288px}.sl--left-72{left:-288px}.sl--top-80{top:-320px}.sl--right-80{right:-320px}.sl--bottom-80{bottom:-320px}.sl--left-80{left:-320px}.sl--top-96{top:-384px}.sl--right-96{right:-384px}.sl--bottom-96{bottom:-384px}.sl--left-96{left:-384px}.sl--top-px{top:-1px}.sl--right-px{right:-1px}.sl--bottom-px{bottom:-1px}.sl--left-px{left:-1px}.sl--top-0\.5{top:-2px}.sl--right-0\.5{right:-2px}.sl--bottom-0\.5{bottom:-2px}.sl--left-0\.5{left:-2px}.sl--top-1\.5{top:-6px}.sl--right-1\.5{right:-6px}.sl--bottom-1\.5{bottom:-6px}.sl--left-1\.5{left:-6px}.sl--top-2\.5{top:-10px}.sl--right-2\.5{right:-10px}.sl--bottom-2\.5{bottom:-10px}.sl--left-2\.5{left:-10px}.sl--top-3\.5{top:-14px}.sl--right-3\.5{right:-14px}.sl--bottom-3\.5{bottom:-14px}.sl--left-3\.5{left:-14px}.sl--top-4\.5{top:-18px}.sl--right-4\.5{right:-18px}.sl--bottom-4\.5{bottom:-18px}.sl--left-4\.5{left:-18px}.sl-justify-start{justify-content:flex-start}.sl-justify-end{justify-content:flex-end}.sl-justify-center{justify-content:center}.sl-justify-between{justify-content:space-between}.sl-justify-around{justify-content:space-around}.sl-justify-evenly{justify-content:space-evenly}.sl-justify-items-start{justify-items:start}.sl-justify-items-end{justify-items:end}.sl-justify-items-center{justify-items:center}.sl-justify-items-stretch{justify-items:stretch}.sl-justify-self-auto{justify-self:auto}.sl-justify-self-start{justify-self:start}.sl-justify-self-end{justify-self:end}.sl-justify-self-center{justify-self:center}.sl-justify-self-stretch{justify-self:stretch}.sl-tracking-tight{letter-spacing:-.025em}.sl-tracking-normal{letter-spacing:0}.sl-tracking-wide{letter-spacing:.025em}.sl-leading-none{line-height:1}.sl-leading-tight{line-height:1.2}.sl-leading-snug{line-height:1.375}.sl-leading-normal{line-height:1.5}.sl-leading-relaxed{line-height:1.625}.sl-leading-loose{line-height:2}.sl-leading-paragraph-leading{line-height:var(--lh-paragraph-leading)}.sl-leading-paragraph{line-height:var(--lh-paragraph)}.sl-leading-paragraph-small{line-height:var(--lh-paragraph-small)}.sl-leading-paragraph-tiny{line-height:var(--lh-paragraph-tiny)}.sl-m-0{margin:0}.sl-m-1{margin:4px}.sl-m-2{margin:8px}.sl-m-3{margin:12px}.sl-m-4{margin:16px}.sl-m-5{margin:20px}.sl-m-6{margin:24px}.sl-m-7{margin:28px}.sl-m-8{margin:32px}.sl-m-9{margin:36px}.sl-m-10{margin:40px}.sl-m-11{margin:44px}.sl-m-12{margin:48px}.sl-m-14{margin:56px}.sl-m-16{margin:64px}.sl-m-20{margin:80px}.sl-m-24{margin:96px}.sl-m-28{margin:112px}.sl-m-32{margin:128px}.sl-m-36{margin:144px}.sl-m-40{margin:160px}.sl-m-44{margin:176px}.sl-m-48{margin:192px}.sl-m-52{margin:208px}.sl-m-56{margin:224px}.sl-m-60{margin:240px}.sl-m-64{margin:256px}.sl-m-72{margin:288px}.sl-m-80{margin:320px}.sl-m-96{margin:384px}.sl-m-auto{margin:auto}.sl-m-px{margin:1px}.sl-m-0\.5{margin:2px}.sl-m-1\.5{margin:6px}.sl-m-2\.5{margin:10px}.sl-m-3\.5{margin:14px}.sl-m-4\.5{margin:18px}.sl--m-0{margin:0}.sl--m-1{margin:-4px}.sl--m-
2{margin:-8px}.sl--m-3{margin:-12px}.sl--m-4{margin:-16px}.sl--m-5{margin:-20px}.sl--m-6{margin:-24px}.sl--m-7{margin:-28px}.sl--m-8{margin:-32px}.sl--m-9{margin:-36px}.sl--m-10{margin:-40px}.sl--m-11{margin:-44px}.sl--m-12{margin:-48px}.sl--m-14{margin:-56px}.sl--m-16{margin:-64px}.sl--m-20{margin:-80px}.sl--m-24{margin:-96px}.sl--m-28{margin:-112px}.sl--m-32{margin:-128px}.sl--m-36{margin:-144px}.sl--m-40{margin:-160px}.sl--m-44{margin:-176px}.sl--m-48{margin:-192px}.sl--m-52{margin:-208px}.sl--m-56{margin:-224px}.sl--m-60{margin:-240px}.sl--m-64{margin:-256px}.sl--m-72{margin:-288px}.sl--m-80{margin:-320px}.sl--m-96{margin:-384px}.sl--m-px{margin:-1px}.sl--m-0\.5{margin:-2px}.sl--m-1\.5{margin:-6px}.sl--m-2\.5{margin:-10px}.sl--m-3\.5{margin:-14px}.sl--m-4\.5{margin:-18px}.sl-my-0{margin-bottom:0;margin-top:0}.sl-mx-0{margin-left:0;margin-right:0}.sl-my-1{margin-bottom:4px;margin-top:4px}.sl-mx-1{margin-left:4px;margin-right:4px}.sl-my-2{margin-bottom:8px;margin-top:8px}.sl-mx-2{margin-left:8px;margin-right:8px}.sl-my-3{margin-bottom:12px;margin-top:12px}.sl-mx-3{margin-left:12px;margin-right:12px}.sl-my-4{margin-bottom:16px;margin-top:16px}.sl-mx-4{margin-left:16px;margin-right:16px}.sl-my-5{margin-bottom:20px;margin-top:20px}.sl-mx-5{margin-left:20px;margin-right:20px}.sl-my-6{margin-bottom:24px;margin-top:24px}.sl-mx-6{margin-left:24px;margin-right:24px}.sl-my-7{margin-bottom:28px;margin-top:28px}.sl-mx-7{margin-left:28px;margin-right:28px}.sl-my-8{margin-bottom:32px;margin-top:32px}.sl-mx-8{margin-left:32px;margin-right:32px}.sl-my-9{margin-bottom:36px;margin-top:36px}.sl-mx-9{margin-left:36px;margin-right:36px}.sl-my-10{margin-bottom:40px;margin-top:40px}.sl-mx-10{margin-left:40px;margin-right:40px}.sl-my-11{margin-bottom:44px;margin-top:44px}.sl-mx-11{margin-left:44px;margin-right:44px}.sl-my-12{margin-bottom:48px;margin-top:48px}.sl-mx-12{margin-left:48px;margin-right:48px}.sl-my-14{margin-bottom:56px;margin-top:56px}.sl-mx-14{margin-left:56px;margin-right:56px}.sl-my-16{margin-bottom:64px;margin-top:64px}.sl-mx-16{margin-left:64px;margin-right:64px}.sl-my-20{margin-bottom:80px;margin-top:80px}.sl-mx-20{margin-left:80px;margin-right:80px}.sl-my-24{margin-bottom:96px;margin-top:96px}.sl-mx-24{margin-left:96px;margin-right:96px}.sl-my-28{margin-bottom:112px;margin-top:112px}.sl-mx-28{margin-left:112px;margin-right:112px}.sl-my-32{margin-bottom:128px;margin-top:128px}.sl-mx-32{margin-left:128px;margin-right:128px}.sl-my-36{margin-bottom:144px;margin-top:144px}.sl-mx-36{margin-left:144px;margin-right:144px}.sl-my-40{margin-bottom:160px;margin-top:160px}.sl-mx-40{margin-left:160px;margin-right:160px}.sl-my-44{margin-bottom:176px;margin-top:176px}.sl-mx-44{margin-left:176px;margin-right:176px}.sl-my-48{margin-bottom:192px;margin-top:192px}.sl-mx-48{margin-left:192px;margin-right:192px}.sl-my-52{margin-bottom:208px;margin-top:208px}.sl-mx-52{margin-left:208px;margin-right:208px}.sl-my-56{margin-bottom:224px;margin-top:224px}.sl-mx-56{margin-left:224px;margin-right:224px}.sl-my-60{margin-bottom:240px;margin-top:240px}.sl-mx-60{margin-left:240px;margin-right:240px}.sl-my-64{margin-bottom:256px;margin-top:256px}.sl-mx-64{margin-left:256px;margin-right:256px}.sl-my-72{margin-bottom:288px;margin-top:288px}.sl-mx-72{margin-left:288px;margin-right:288px}.sl-my-80{margin-bottom:320px;margin-top:320px}.sl-mx-80{margin-left:320px;margin-right:320px}.sl-my-96{margin-bottom:384px;margin-top:384px}.sl-mx-96{margin-left:384px;margin-right:384px}.sl-my-auto{margin-bottom:auto;margin-top:auto}.sl-mx-aut
o{margin-left:auto;margin-right:auto}.sl-my-px{margin-bottom:1px;margin-top:1px}.sl-mx-px{margin-left:1px;margin-right:1px}.sl-my-0\.5{margin-bottom:2px;margin-top:2px}.sl-mx-0\.5{margin-left:2px;margin-right:2px}.sl-my-1\.5{margin-bottom:6px;margin-top:6px}.sl-mx-1\.5{margin-left:6px;margin-right:6px}.sl-my-2\.5{margin-bottom:10px;margin-top:10px}.sl-mx-2\.5{margin-left:10px;margin-right:10px}.sl-my-3\.5{margin-bottom:14px;margin-top:14px}.sl-mx-3\.5{margin-left:14px;margin-right:14px}.sl-my-4\.5{margin-bottom:18px;margin-top:18px}.sl-mx-4\.5{margin-left:18px;margin-right:18px}.sl--my-0{margin-bottom:0;margin-top:0}.sl--mx-0{margin-left:0;margin-right:0}.sl--my-1{margin-bottom:-4px;margin-top:-4px}.sl--mx-1{margin-left:-4px;margin-right:-4px}.sl--my-2{margin-bottom:-8px;margin-top:-8px}.sl--mx-2{margin-left:-8px;margin-right:-8px}.sl--my-3{margin-bottom:-12px;margin-top:-12px}.sl--mx-3{margin-left:-12px;margin-right:-12px}.sl--my-4{margin-bottom:-16px;margin-top:-16px}.sl--mx-4{margin-left:-16px;margin-right:-16px}.sl--my-5{margin-bottom:-20px;margin-top:-20px}.sl--mx-5{margin-left:-20px;margin-right:-20px}.sl--my-6{margin-bottom:-24px;margin-top:-24px}.sl--mx-6{margin-left:-24px;margin-right:-24px}.sl--my-7{margin-bottom:-28px;margin-top:-28px}.sl--mx-7{margin-left:-28px;margin-right:-28px}.sl--my-8{margin-bottom:-32px;margin-top:-32px}.sl--mx-8{margin-left:-32px;margin-right:-32px}.sl--my-9{margin-bottom:-36px;margin-top:-36px}.sl--mx-9{margin-left:-36px;margin-right:-36px}.sl--my-10{margin-bottom:-40px;margin-top:-40px}.sl--mx-10{margin-left:-40px;margin-right:-40px}.sl--my-11{margin-bottom:-44px;margin-top:-44px}.sl--mx-11{margin-left:-44px;margin-right:-44px}.sl--my-12{margin-bottom:-48px;margin-top:-48px}.sl--mx-12{margin-left:-48px;margin-right:-48px}.sl--my-14{margin-bottom:-56px;margin-top:-56px}.sl--mx-14{margin-left:-56px;margin-right:-56px}.sl--my-16{margin-bottom:-64px;margin-top:-64px}.sl--mx-16{margin-left:-64px;margin-right:-64px}.sl--my-20{margin-bottom:-80px;margin-top:-80px}.sl--mx-20{margin-left:-80px;margin-right:-80px}.sl--my-24{margin-bottom:-96px;margin-top:-96px}.sl--mx-24{margin-left:-96px;margin-right:-96px}.sl--my-28{margin-bottom:-112px;margin-top:-112px}.sl--mx-28{margin-left:-112px;margin-right:-112px}.sl--my-32{margin-bottom:-128px;margin-top:-128px}.sl--mx-32{margin-left:-128px;margin-right:-128px}.sl--my-36{margin-bottom:-144px;margin-top:-144px}.sl--mx-36{margin-left:-144px;margin-right:-144px}.sl--my-40{margin-bottom:-160px;margin-top:-160px}.sl--mx-40{margin-left:-160px;margin-right:-160px}.sl--my-44{margin-bottom:-176px;margin-top:-176px}.sl--mx-44{margin-left:-176px;margin-right:-176px}.sl--my-48{margin-bottom:-192px;margin-top:-192px}.sl--mx-48{margin-left:-192px;margin-right:-192px}.sl--my-52{margin-bottom:-208px;margin-top:-208px}.sl--mx-52{margin-left:-208px;margin-right:-208px}.sl--my-56{margin-bottom:-224px;margin-top:-224px}.sl--mx-56{margin-left:-224px;margin-right:-224px}.sl--my-60{margin-bottom:-240px;margin-top:-240px}.sl--mx-60{margin-left:-240px;margin-right:-240px}.sl--my-64{margin-bottom:-256px;margin-top:-256px}.sl--mx-64{margin-left:-256px;margin-right:-256px}.sl--my-72{margin-bottom:-288px;margin-top:-288px}.sl--mx-72{margin-left:-288px;margin-right:-288px}.sl--my-80{margin-bottom:-320px;margin-top:-320px}.sl--mx-80{margin-left:-320px;margin-right:-320px}.sl--my-96{margin-bottom:-384px;margin-top:-384px}.sl--mx-96{margin-left:-384px;margin-right:-384px}.sl--my-px{margin-bottom:-1px;margin-top:-1px}.sl--mx-px{margin-left:-1px;margin-r
ight:-1px}.sl--my-0\.5{margin-bottom:-2px;margin-top:-2px}.sl--mx-0\.5{margin-left:-2px;margin-right:-2px}.sl--my-1\.5{margin-bottom:-6px;margin-top:-6px}.sl--mx-1\.5{margin-left:-6px;margin-right:-6px}.sl--my-2\.5{margin-bottom:-10px;margin-top:-10px}.sl--mx-2\.5{margin-left:-10px;margin-right:-10px}.sl--my-3\.5{margin-bottom:-14px;margin-top:-14px}.sl--mx-3\.5{margin-left:-14px;margin-right:-14px}.sl--my-4\.5{margin-bottom:-18px;margin-top:-18px}.sl--mx-4\.5{margin-left:-18px;margin-right:-18px}.sl-mt-0{margin-top:0}.sl-mr-0{margin-right:0}.sl-mb-0{margin-bottom:0}.sl-ml-0{margin-left:0}.sl-mt-1{margin-top:4px}.sl-mr-1{margin-right:4px}.sl-mb-1{margin-bottom:4px}.sl-ml-1{margin-left:4px}.sl-mt-2{margin-top:8px}.sl-mr-2{margin-right:8px}.sl-mb-2{margin-bottom:8px}.sl-ml-2{margin-left:8px}.sl-mt-3{margin-top:12px}.sl-mr-3{margin-right:12px}.sl-mb-3{margin-bottom:12px}.sl-ml-3{margin-left:12px}.sl-mt-4{margin-top:16px}.sl-mr-4{margin-right:16px}.sl-mb-4{margin-bottom:16px}.sl-ml-4{margin-left:16px}.sl-mt-5{margin-top:20px}.sl-mr-5{margin-right:20px}.sl-mb-5{margin-bottom:20px}.sl-ml-5{margin-left:20px}.sl-mt-6{margin-top:24px}.sl-mr-6{margin-right:24px}.sl-mb-6{margin-bottom:24px}.sl-ml-6{margin-left:24px}.sl-mt-7{margin-top:28px}.sl-mr-7{margin-right:28px}.sl-mb-7{margin-bottom:28px}.sl-ml-7{margin-left:28px}.sl-mt-8{margin-top:32px}.sl-mr-8{margin-right:32px}.sl-mb-8{margin-bottom:32px}.sl-ml-8{margin-left:32px}.sl-mt-9{margin-top:36px}.sl-mr-9{margin-right:36px}.sl-mb-9{margin-bottom:36px}.sl-ml-9{margin-left:36px}.sl-mt-10{margin-top:40px}.sl-mr-10{margin-right:40px}.sl-mb-10{margin-bottom:40px}.sl-ml-10{margin-left:40px}.sl-mt-11{margin-top:44px}.sl-mr-11{margin-right:44px}.sl-mb-11{margin-bottom:44px}.sl-ml-11{margin-left:44px}.sl-mt-12{margin-top:48px}.sl-mr-12{margin-right:48px}.sl-mb-12{margin-bottom:48px}.sl-ml-12{margin-left:48px}.sl-mt-14{margin-top:56px}.sl-mr-14{margin-right:56px}.sl-mb-14{margin-bottom:56px}.sl-ml-14{margin-left:56px}.sl-mt-16{margin-top:64px}.sl-mr-16{margin-right:64px}.sl-mb-16{margin-bottom:64px}.sl-ml-16{margin-left:64px}.sl-mt-20{margin-top:80px}.sl-mr-20{margin-right:80px}.sl-mb-20{margin-bottom:80px}.sl-ml-20{margin-left:80px}.sl-mt-24{margin-top:96px}.sl-mr-24{margin-right:96px}.sl-mb-24{margin-bottom:96px}.sl-ml-24{margin-left:96px}.sl-mt-28{margin-top:112px}.sl-mr-28{margin-right:112px}.sl-mb-28{margin-bottom:112px}.sl-ml-28{margin-left:112px}.sl-mt-32{margin-top:128px}.sl-mr-32{margin-right:128px}.sl-mb-32{margin-bottom:128px}.sl-ml-32{margin-left:128px}.sl-mt-36{margin-top:144px}.sl-mr-36{margin-right:144px}.sl-mb-36{margin-bottom:144px}.sl-ml-36{margin-left:144px}.sl-mt-40{margin-top:160px}.sl-mr-40{margin-right:160px}.sl-mb-40{margin-bottom:160px}.sl-ml-40{margin-left:160px}.sl-mt-44{margin-top:176px}.sl-mr-44{margin-right:176px}.sl-mb-44{margin-bottom:176px}.sl-ml-44{margin-left:176px}.sl-mt-48{margin-top:192px}.sl-mr-48{margin-right:192px}.sl-mb-48{margin-bottom:192px}.sl-ml-48{margin-left:192px}.sl-mt-52{margin-top:208px}.sl-mr-52{margin-right:208px}.sl-mb-52{margin-bottom:208px}.sl-ml-52{margin-left:208px}.sl-mt-56{margin-top:224px}.sl-mr-56{margin-right:224px}.sl-mb-56{margin-bottom:224px}.sl-ml-56{margin-left:224px}.sl-mt-60{margin-top:240px}.sl-mr-60{margin-right:240px}.sl-mb-60{margin-bottom:240px}.sl-ml-60{margin-left:240px}.sl-mt-64{margin-top:256px}.sl-mr-64{margin-right:256px}.sl-mb-64{margin-bottom:256px}.sl-ml-64{margin-left:256px}.sl-mt-72{margin-top:288px}.sl-mr-72{margin-right:288px}.sl-mb-72{margin-bottom:288px}.sl-ml-72{margin-
left:288px}.sl-mt-80{margin-top:320px}.sl-mr-80{margin-right:320px}.sl-mb-80{margin-bottom:320px}.sl-ml-80{margin-left:320px}.sl-mt-96{margin-top:384px}.sl-mr-96{margin-right:384px}.sl-mb-96{margin-bottom:384px}.sl-ml-96{margin-left:384px}.sl-mt-auto{margin-top:auto}.sl-mr-auto{margin-right:auto}.sl-mb-auto{margin-bottom:auto}.sl-ml-auto{margin-left:auto}.sl-mt-px{margin-top:1px}.sl-mr-px{margin-right:1px}.sl-mb-px{margin-bottom:1px}.sl-ml-px{margin-left:1px}.sl-mt-0\.5{margin-top:2px}.sl-mr-0\.5{margin-right:2px}.sl-mb-0\.5{margin-bottom:2px}.sl-ml-0\.5{margin-left:2px}.sl-mt-1\.5{margin-top:6px}.sl-mr-1\.5{margin-right:6px}.sl-mb-1\.5{margin-bottom:6px}.sl-ml-1\.5{margin-left:6px}.sl-mt-2\.5{margin-top:10px}.sl-mr-2\.5{margin-right:10px}.sl-mb-2\.5{margin-bottom:10px}.sl-ml-2\.5{margin-left:10px}.sl-mt-3\.5{margin-top:14px}.sl-mr-3\.5{margin-right:14px}.sl-mb-3\.5{margin-bottom:14px}.sl-ml-3\.5{margin-left:14px}.sl-mt-4\.5{margin-top:18px}.sl-mr-4\.5{margin-right:18px}.sl-mb-4\.5{margin-bottom:18px}.sl-ml-4\.5{margin-left:18px}.sl--mt-0{margin-top:0}.sl--mr-0{margin-right:0}.sl--mb-0{margin-bottom:0}.sl--ml-0{margin-left:0}.sl--mt-1{margin-top:-4px}.sl--mr-1{margin-right:-4px}.sl--mb-1{margin-bottom:-4px}.sl--ml-1{margin-left:-4px}.sl--mt-2{margin-top:-8px}.sl--mr-2{margin-right:-8px}.sl--mb-2{margin-bottom:-8px}.sl--ml-2{margin-left:-8px}.sl--mt-3{margin-top:-12px}.sl--mr-3{margin-right:-12px}.sl--mb-3{margin-bottom:-12px}.sl--ml-3{margin-left:-12px}.sl--mt-4{margin-top:-16px}.sl--mr-4{margin-right:-16px}.sl--mb-4{margin-bottom:-16px}.sl--ml-4{margin-left:-16px}.sl--mt-5{margin-top:-20px}.sl--mr-5{margin-right:-20px}.sl--mb-5{margin-bottom:-20px}.sl--ml-5{margin-left:-20px}.sl--mt-6{margin-top:-24px}.sl--mr-6{margin-right:-24px}.sl--mb-6{margin-bottom:-24px}.sl--ml-6{margin-left:-24px}.sl--mt-7{margin-top:-28px}.sl--mr-7{margin-right:-28px}.sl--mb-7{margin-bottom:-28px}.sl--ml-7{margin-left:-28px}.sl--mt-8{margin-top:-32px}.sl--mr-8{margin-right:-32px}.sl--mb-8{margin-bottom:-32px}.sl--ml-8{margin-left:-32px}.sl--mt-9{margin-top:-36px}.sl--mr-9{margin-right:-36px}.sl--mb-9{margin-bottom:-36px}.sl--ml-9{margin-left:-36px}.sl--mt-10{margin-top:-40px}.sl--mr-10{margin-right:-40px}.sl--mb-10{margin-bottom:-40px}.sl--ml-10{margin-left:-40px}.sl--mt-11{margin-top:-44px}.sl--mr-11{margin-right:-44px}.sl--mb-11{margin-bottom:-44px}.sl--ml-11{margin-left:-44px}.sl--mt-12{margin-top:-48px}.sl--mr-12{margin-right:-48px}.sl--mb-12{margin-bottom:-48px}.sl--ml-12{margin-left:-48px}.sl--mt-14{margin-top:-56px}.sl--mr-14{margin-right:-56px}.sl--mb-14{margin-bottom:-56px}.sl--ml-14{margin-left:-56px}.sl--mt-16{margin-top:-64px}.sl--mr-16{margin-right:-64px}.sl--mb-16{margin-bottom:-64px}.sl--ml-16{margin-left:-64px}.sl--mt-20{margin-top:-80px}.sl--mr-20{margin-right:-80px}.sl--mb-20{margin-bottom:-80px}.sl--ml-20{margin-left:-80px}.sl--mt-24{margin-top:-96px}.sl--mr-24{margin-right:-96px}.sl--mb-24{margin-bottom:-96px}.sl--ml-24{margin-left:-96px}.sl--mt-28{margin-top:-112px}.sl--mr-28{margin-right:-112px}.sl--mb-28{margin-bottom:-112px}.sl--ml-28{margin-left:-112px}.sl--mt-32{margin-top:-128px}.sl--mr-32{margin-right:-128px}.sl--mb-32{margin-bottom:-128px}.sl--ml-32{margin-left:-128px}.sl--mt-36{margin-top:-144px}.sl--mr-36{margin-right:-144px}.sl--mb-36{margin-bottom:-144px}.sl--ml-36{margin-left:-144px}.sl--mt-40{margin-top:-160px}.sl--mr-40{margin-right:-160px}.sl--mb-40{margin-bottom:-160px}.sl--ml-40{margin-left:-160px}.sl--mt-44{margin-top:-176px}.sl--mr-44{margin-right:-176px}.sl--mb-44{margin-bo
ttom:-176px}.sl--ml-44{margin-left:-176px}.sl--mt-48{margin-top:-192px}.sl--mr-48{margin-right:-192px}.sl--mb-48{margin-bottom:-192px}.sl--ml-48{margin-left:-192px}.sl--mt-52{margin-top:-208px}.sl--mr-52{margin-right:-208px}.sl--mb-52{margin-bottom:-208px}.sl--ml-52{margin-left:-208px}.sl--mt-56{margin-top:-224px}.sl--mr-56{margin-right:-224px}.sl--mb-56{margin-bottom:-224px}.sl--ml-56{margin-left:-224px}.sl--mt-60{margin-top:-240px}.sl--mr-60{margin-right:-240px}.sl--mb-60{margin-bottom:-240px}.sl--ml-60{margin-left:-240px}.sl--mt-64{margin-top:-256px}.sl--mr-64{margin-right:-256px}.sl--mb-64{margin-bottom:-256px}.sl--ml-64{margin-left:-256px}.sl--mt-72{margin-top:-288px}.sl--mr-72{margin-right:-288px}.sl--mb-72{margin-bottom:-288px}.sl--ml-72{margin-left:-288px}.sl--mt-80{margin-top:-320px}.sl--mr-80{margin-right:-320px}.sl--mb-80{margin-bottom:-320px}.sl--ml-80{margin-left:-320px}.sl--mt-96{margin-top:-384px}.sl--mr-96{margin-right:-384px}.sl--mb-96{margin-bottom:-384px}.sl--ml-96{margin-left:-384px}.sl--mt-px{margin-top:-1px}.sl--mr-px{margin-right:-1px}.sl--mb-px{margin-bottom:-1px}.sl--ml-px{margin-left:-1px}.sl--mt-0\.5{margin-top:-2px}.sl--mr-0\.5{margin-right:-2px}.sl--mb-0\.5{margin-bottom:-2px}.sl--ml-0\.5{margin-left:-2px}.sl--mt-1\.5{margin-top:-6px}.sl--mr-1\.5{margin-right:-6px}.sl--mb-1\.5{margin-bottom:-6px}.sl--ml-1\.5{margin-left:-6px}.sl--mt-2\.5{margin-top:-10px}.sl--mr-2\.5{margin-right:-10px}.sl--mb-2\.5{margin-bottom:-10px}.sl--ml-2\.5{margin-left:-10px}.sl--mt-3\.5{margin-top:-14px}.sl--mr-3\.5{margin-right:-14px}.sl--mb-3\.5{margin-bottom:-14px}.sl--ml-3\.5{margin-left:-14px}.sl--mt-4\.5{margin-top:-18px}.sl--mr-4\.5{margin-right:-18px}.sl--mb-4\.5{margin-bottom:-18px}.sl--ml-4\.5{margin-left:-18px}.sl-max-h-full{max-height:100%}.sl-max-h-screen{max-height:100vh}.sl-max-w-none{max-width:none}.sl-max-w-full{max-width:100%}.sl-max-w-min{max-width:-moz-min-content;max-width:min-content}.sl-max-w-max{max-width:-moz-max-content;max-width:max-content}.sl-max-w-prose{max-width:65ch}.sl-min-h-full{min-height:100%}.sl-min-h-screen{min-height:100vh}.sl-min-w-full{min-width:100%}.sl-min-w-min{min-width:-moz-min-content;min-width:min-content}.sl-min-w-max{min-width:-moz-max-content;min-width:max-content}.sl-object-contain{object-fit:contain}.sl-object-cover{object-fit:cover}.sl-object-fill{object-fit:fill}.sl-object-none{object-fit:none}.sl-object-scale-down{object-fit:scale-down}.sl-object-bottom{object-position:bottom}.sl-object-center{object-position:center}.sl-object-left{object-position:left}.sl-object-left-bottom{object-position:left bottom}.sl-object-left-top{object-position:left top}.sl-object-right{object-position:right}.sl-object-right-bottom{object-position:right bottom}.sl-object-right-top{object-position:right 
top}.sl-object-top{object-position:top}.sl-opacity-0{opacity:0}.sl-opacity-5{opacity:.05}.sl-opacity-10{opacity:.1}.sl-opacity-20{opacity:.2}.sl-opacity-30{opacity:.3}.sl-opacity-40{opacity:.4}.sl-opacity-50{opacity:.5}.sl-opacity-60{opacity:.6}.sl-opacity-70{opacity:.7}.sl-opacity-90{opacity:.9}.sl-opacity-100{opacity:1}.hover\:sl-opacity-0:hover{opacity:0}.hover\:sl-opacity-5:hover{opacity:.05}.hover\:sl-opacity-10:hover{opacity:.1}.hover\:sl-opacity-20:hover{opacity:.2}.hover\:sl-opacity-30:hover{opacity:.3}.hover\:sl-opacity-40:hover{opacity:.4}.hover\:sl-opacity-50:hover{opacity:.5}.hover\:sl-opacity-60:hover{opacity:.6}.hover\:sl-opacity-70:hover{opacity:.7}.hover\:sl-opacity-90:hover{opacity:.9}.hover\:sl-opacity-100:hover{opacity:1}.focus\:sl-opacity-0:focus{opacity:0}.focus\:sl-opacity-5:focus{opacity:.05}.focus\:sl-opacity-10:focus{opacity:.1}.focus\:sl-opacity-20:focus{opacity:.2}.focus\:sl-opacity-30:focus{opacity:.3}.focus\:sl-opacity-40:focus{opacity:.4}.focus\:sl-opacity-50:focus{opacity:.5}.focus\:sl-opacity-60:focus{opacity:.6}.focus\:sl-opacity-70:focus{opacity:.7}.focus\:sl-opacity-90:focus{opacity:.9}.focus\:sl-opacity-100:focus{opacity:1}.active\:sl-opacity-0:active{opacity:0}.active\:sl-opacity-5:active{opacity:.05}.active\:sl-opacity-10:active{opacity:.1}.active\:sl-opacity-20:active{opacity:.2}.active\:sl-opacity-30:active{opacity:.3}.active\:sl-opacity-40:active{opacity:.4}.active\:sl-opacity-50:active{opacity:.5}.active\:sl-opacity-60:active{opacity:.6}.active\:sl-opacity-70:active{opacity:.7}.active\:sl-opacity-90:active{opacity:.9}.active\:sl-opacity-100:active{opacity:1}.disabled\:sl-opacity-0:disabled{opacity:0}.disabled\:sl-opacity-5:disabled{opacity:.05}.disabled\:sl-opacity-10:disabled{opacity:.1}.disabled\:sl-opacity-20:disabled{opacity:.2}.disabled\:sl-opacity-30:disabled{opacity:.3}.disabled\:sl-opacity-40:disabled{opacity:.4}.disabled\:sl-opacity-50:disabled{opacity:.5}.disabled\:sl-opacity-60:disabled{opacity:.6}.disabled\:sl-opacity-70:disabled{opacity:.7}.disabled\:sl-opacity-90:disabled{opacity:.9}.disabled\:sl-opacity-100:disabled{opacity:1}.sl-outline-none{outline:2px solid 
transparent;outline-offset:2px}.sl-overflow-auto{overflow:auto}.sl-overflow-hidden{overflow:hidden}.sl-overflow-visible{overflow:visible}.sl-overflow-scroll{overflow:scroll}.sl-overflow-x-auto{overflow-x:auto}.sl-overflow-y-auto{overflow-y:auto}.sl-overflow-x-hidden{overflow-x:hidden}.sl-overflow-y-hidden{overflow-y:hidden}.sl-overflow-x-visible{overflow-x:visible}.sl-overflow-y-visible{overflow-y:visible}.sl-overflow-x-scroll{overflow-x:scroll}.sl-overflow-y-scroll{overflow-y:scroll}.sl-overscroll-auto{overscroll-behavior:auto}.sl-overscroll-contain{overscroll-behavior:contain}.sl-overscroll-none{overscroll-behavior:none}.sl-overscroll-y-auto{overscroll-behavior-y:auto}.sl-overscroll-y-contain{overscroll-behavior-y:contain}.sl-overscroll-y-none{overscroll-behavior-y:none}.sl-overscroll-x-auto{overscroll-behavior-x:auto}.sl-overscroll-x-contain{overscroll-behavior-x:contain}.sl-overscroll-x-none{overscroll-behavior-x:none}.sl-p-0{padding:0}.sl-p-1{padding:4px}.sl-p-2{padding:8px}.sl-p-3{padding:12px}.sl-p-4{padding:16px}.sl-p-5{padding:20px}.sl-p-6{padding:24px}.sl-p-7{padding:28px}.sl-p-8{padding:32px}.sl-p-9{padding:36px}.sl-p-10{padding:40px}.sl-p-11{padding:44px}.sl-p-12{padding:48px}.sl-p-14{padding:56px}.sl-p-16{padding:64px}.sl-p-20{padding:80px}.sl-p-24{padding:96px}.sl-p-28{padding:112px}.sl-p-32{padding:128px}.sl-p-36{padding:144px}.sl-p-40{padding:160px}.sl-p-44{padding:176px}.sl-p-48{padding:192px}.sl-p-52{padding:208px}.sl-p-56{padding:224px}.sl-p-60{padding:240px}.sl-p-64{padding:256px}.sl-p-72{padding:288px}.sl-p-80{padding:320px}.sl-p-96{padding:384px}.sl-p-px{padding:1px}.sl-p-0\.5{padding:2px}.sl-p-1\.5{padding:6px}.sl-p-2\.5{padding:10px}.sl-p-3\.5{padding:14px}.sl-p-4\.5{padding:18px}.sl-py-0{padding-bottom:0;padding-top:0}.sl-px-0{padding-left:0;padding-right:0}.sl-py-1{padding-bottom:4px;padding-top:4px}.sl-px-1{padding-left:4px;padding-right:4px}.sl-py-2{padding-bottom:8px;padding-top:8px}.sl-px-2{padding-left:8px;padding-right:8px}.sl-py-3{padding-bottom:12px;padding-top:12px}.sl-px-3{padding-left:12px;padding-right:12px}.sl-py-4{padding-bottom:16px;padding-top:16px}.sl-px-4{padding-left:16px;padding-right:16px}.sl-py-5{padding-bottom:20px;padding-top:20px}.sl-px-5{padding-left:20px;padding-right:20px}.sl-py-6{padding-bottom:24px;padding-top:24px}.sl-px-6{padding-left:24px;padding-right:24px}.sl-py-7{padding-bottom:28px;padding-top:28px}.sl-px-7{padding-left:28px;padding-right:28px}.sl-py-8{padding-bottom:32px;padding-top:32px}.sl-px-8{padding-left:32px;padding-right:32px}.sl-py-9{padding-bottom:36px;padding-top:36px}.sl-px-9{padding-left:36px;padding-right:36px}.sl-py-10{padding-bottom:40px;padding-top:40px}.sl-px-10{padding-left:40px;padding-right:40px}.sl-py-11{padding-bottom:44px;padding-top:44px}.sl-px-11{padding-left:44px;padding-right:44px}.sl-py-12{padding-bottom:48px;padding-top:48px}.sl-px-12{padding-left:48px;padding-right:48px}.sl-py-14{padding-bottom:56px;padding-top:56px}.sl-px-14{padding-left:56px;padding-right:56px}.sl-py-16{padding-bottom:64px;padding-top:64px}.sl-px-16{padding-left:64px;padding-right:64px}.sl-py-20{padding-bottom:80px;padding-top:80px}.sl-px-20{padding-left:80px;padding-right:80px}.sl-py-24{padding-bottom:96px;padding-top:96px}.sl-px-24{padding-left:96px;padding-right:96px}.sl-py-28{padding-bottom:112px;padding-top:112px}.sl-px-28{padding-left:112px;padding-right:112px}.sl-py-32{padding-bottom:128px;padding-top:128px}.sl-px-32{padding-left:128px;padding-right:128px}.sl-py-36{padding-bottom:144px;padding-top:144px}.sl-px-36{padding-l
eft:144px;padding-right:144px}.sl-py-40{padding-bottom:160px;padding-top:160px}.sl-px-40{padding-left:160px;padding-right:160px}.sl-py-44{padding-bottom:176px;padding-top:176px}.sl-px-44{padding-left:176px;padding-right:176px}.sl-py-48{padding-bottom:192px;padding-top:192px}.sl-px-48{padding-left:192px;padding-right:192px}.sl-py-52{padding-bottom:208px;padding-top:208px}.sl-px-52{padding-left:208px;padding-right:208px}.sl-py-56{padding-bottom:224px;padding-top:224px}.sl-px-56{padding-left:224px;padding-right:224px}.sl-py-60{padding-bottom:240px;padding-top:240px}.sl-px-60{padding-left:240px;padding-right:240px}.sl-py-64{padding-bottom:256px;padding-top:256px}.sl-px-64{padding-left:256px;padding-right:256px}.sl-py-72{padding-bottom:288px;padding-top:288px}.sl-px-72{padding-left:288px;padding-right:288px}.sl-py-80{padding-bottom:320px;padding-top:320px}.sl-px-80{padding-left:320px;padding-right:320px}.sl-py-96{padding-bottom:384px;padding-top:384px}.sl-px-96{padding-left:384px;padding-right:384px}.sl-py-px{padding-bottom:1px;padding-top:1px}.sl-px-px{padding-left:1px;padding-right:1px}.sl-py-0\.5{padding-bottom:2px;padding-top:2px}.sl-px-0\.5{padding-left:2px;padding-right:2px}.sl-py-1\.5{padding-bottom:6px;padding-top:6px}.sl-px-1\.5{padding-left:6px;padding-right:6px}.sl-py-2\.5{padding-bottom:10px;padding-top:10px}.sl-px-2\.5{padding-left:10px;padding-right:10px}.sl-py-3\.5{padding-bottom:14px;padding-top:14px}.sl-px-3\.5{padding-left:14px;padding-right:14px}.sl-py-4\.5{padding-bottom:18px;padding-top:18px}.sl-px-4\.5{padding-left:18px;padding-right:18px}.sl-pt-0{padding-top:0}.sl-pr-0{padding-right:0}.sl-pb-0{padding-bottom:0}.sl-pl-0{padding-left:0}.sl-pt-1{padding-top:4px}.sl-pr-1{padding-right:4px}.sl-pb-1{padding-bottom:4px}.sl-pl-1{padding-left:4px}.sl-pt-2{padding-top:8px}.sl-pr-2{padding-right:8px}.sl-pb-2{padding-bottom:8px}.sl-pl-2{padding-left:8px}.sl-pt-3{padding-top:12px}.sl-pr-3{padding-right:12px}.sl-pb-3{padding-bottom:12px}.sl-pl-3{padding-left:12px}.sl-pt-4{padding-top:16px}.sl-pr-4{padding-right:16px}.sl-pb-4{padding-bottom:16px}.sl-pl-4{padding-left:16px}.sl-pt-5{padding-top:20px}.sl-pr-5{padding-right:20px}.sl-pb-5{padding-bottom:20px}.sl-pl-5{padding-left:20px}.sl-pt-6{padding-top:24px}.sl-pr-6{padding-right:24px}.sl-pb-6{padding-bottom:24px}.sl-pl-6{padding-left:24px}.sl-pt-7{padding-top:28px}.sl-pr-7{padding-right:28px}.sl-pb-7{padding-bottom:28px}.sl-pl-7{padding-left:28px}.sl-pt-8{padding-top:32px}.sl-pr-8{padding-right:32px}.sl-pb-8{padding-bottom:32px}.sl-pl-8{padding-left:32px}.sl-pt-9{padding-top:36px}.sl-pr-9{padding-right:36px}.sl-pb-9{padding-bottom:36px}.sl-pl-9{padding-left:36px}.sl-pt-10{padding-top:40px}.sl-pr-10{padding-right:40px}.sl-pb-10{padding-bottom:40px}.sl-pl-10{padding-left:40px}.sl-pt-11{padding-top:44px}.sl-pr-11{padding-right:44px}.sl-pb-11{padding-bottom:44px}.sl-pl-11{padding-left:44px}.sl-pt-12{padding-top:48px}.sl-pr-12{padding-right:48px}.sl-pb-12{padding-bottom:48px}.sl-pl-12{padding-left:48px}.sl-pt-14{padding-top:56px}.sl-pr-14{padding-right:56px}.sl-pb-14{padding-bottom:56px}.sl-pl-14{padding-left:56px}.sl-pt-16{padding-top:64px}.sl-pr-16{padding-right:64px}.sl-pb-16{padding-bottom:64px}.sl-pl-16{padding-left:64px}.sl-pt-20{padding-top:80px}.sl-pr-20{padding-right:80px}.sl-pb-20{padding-bottom:80px}.sl-pl-20{padding-left:80px}.sl-pt-24{padding-top:96px}.sl-pr-24{padding-right:96px}.sl-pb-24{padding-bottom:96px}.sl-pl-24{padding-left:96px}.sl-pt-28{padding-top:112px}.sl-pr-28{padding-right:112px}.sl-pb-28{padding-bottom:112px}.sl-pl
-28{padding-left:112px}.sl-pt-32{padding-top:128px}.sl-pr-32{padding-right:128px}.sl-pb-32{padding-bottom:128px}.sl-pl-32{padding-left:128px}.sl-pt-36{padding-top:144px}.sl-pr-36{padding-right:144px}.sl-pb-36{padding-bottom:144px}.sl-pl-36{padding-left:144px}.sl-pt-40{padding-top:160px}.sl-pr-40{padding-right:160px}.sl-pb-40{padding-bottom:160px}.sl-pl-40{padding-left:160px}.sl-pt-44{padding-top:176px}.sl-pr-44{padding-right:176px}.sl-pb-44{padding-bottom:176px}.sl-pl-44{padding-left:176px}.sl-pt-48{padding-top:192px}.sl-pr-48{padding-right:192px}.sl-pb-48{padding-bottom:192px}.sl-pl-48{padding-left:192px}.sl-pt-52{padding-top:208px}.sl-pr-52{padding-right:208px}.sl-pb-52{padding-bottom:208px}.sl-pl-52{padding-left:208px}.sl-pt-56{padding-top:224px}.sl-pr-56{padding-right:224px}.sl-pb-56{padding-bottom:224px}.sl-pl-56{padding-left:224px}.sl-pt-60{padding-top:240px}.sl-pr-60{padding-right:240px}.sl-pb-60{padding-bottom:240px}.sl-pl-60{padding-left:240px}.sl-pt-64{padding-top:256px}.sl-pr-64{padding-right:256px}.sl-pb-64{padding-bottom:256px}.sl-pl-64{padding-left:256px}.sl-pt-72{padding-top:288px}.sl-pr-72{padding-right:288px}.sl-pb-72{padding-bottom:288px}.sl-pl-72{padding-left:288px}.sl-pt-80{padding-top:320px}.sl-pr-80{padding-right:320px}.sl-pb-80{padding-bottom:320px}.sl-pl-80{padding-left:320px}.sl-pt-96{padding-top:384px}.sl-pr-96{padding-right:384px}.sl-pb-96{padding-bottom:384px}.sl-pl-96{padding-left:384px}.sl-pt-px{padding-top:1px}.sl-pr-px{padding-right:1px}.sl-pb-px{padding-bottom:1px}.sl-pl-px{padding-left:1px}.sl-pt-0\.5{padding-top:2px}.sl-pr-0\.5{padding-right:2px}.sl-pb-0\.5{padding-bottom:2px}.sl-pl-0\.5{padding-left:2px}.sl-pt-1\.5{padding-top:6px}.sl-pr-1\.5{padding-right:6px}.sl-pb-1\.5{padding-bottom:6px}.sl-pl-1\.5{padding-left:6px}.sl-pt-2\.5{padding-top:10px}.sl-pr-2\.5{padding-right:10px}.sl-pb-2\.5{padding-bottom:10px}.sl-pl-2\.5{padding-left:10px}.sl-pt-3\.5{padding-top:14px}.sl-pr-3\.5{padding-right:14px}.sl-pb-3\.5{padding-bottom:14px}.sl-pl-3\.5{padding-left:14px}.sl-pt-4\.5{padding-top:18px}.sl-pr-4\.5{padding-right:18px}.sl-pb-4\.5{padding-bottom:18px}.sl-pl-4\.5{padding-left:18px}.sl-placeholder::-ms-input-placeholder{color:var(--color-text-light)}.sl-placeholder::placeholder{color:var(--color-text-light)}.sl-placeholder-primary::-ms-input-placeholder{color:#3898ff}.sl-placeholder-primary::placeholder{color:#3898ff}.sl-placeholder-success::-ms-input-placeholder{color:#0ea06f}.sl-placeholder-success::placeholder{color:#0ea06f}.sl-placeholder-warning::-ms-input-placeholder{color:#f3602b}.sl-placeholder-warning::placeholder{color:#f3602b}.sl-placeholder-danger::-ms-input-placeholder{color:#f05151}.sl-placeholder-danger::placeholder{color:#f05151}.sl-pointer-events-none{pointer-events:none}.sl-pointer-events-auto{pointer-events:auto}.sl-static{position:static}.sl-fixed{position:fixed}.sl-absolute{position:absolute}.sl-relative{position:relative}.sl-sticky{position:-webkit-sticky;position:sticky}.sl-resize-none{resize:none}.sl-resize-y{resize:vertical}.sl-resize-x{resize:horizontal}.sl-resize{resize:both}.sl-ring-primary{--tw-ring-color:hsla(var(--primary-h),80%,61%,var(--tw-ring-opacity)) }.sl-ring-success{--tw-ring-color:hsla(var(--success-h),84%,34%,var(--tw-ring-opacity)) }.sl-ring-warning{--tw-ring-color:hsla(var(--warning-h),89%,56%,var(--tw-ring-opacity)) }.sl-ring-danger{--tw-ring-color:hsla(var(--danger-h),84%,63%,var(--tw-ring-opacity)) }.focus\:sl-ring-primary:focus{--tw-ring-color:hsla(var(--primary-h),80%,61%,var(--tw-ring-opacity)) 
}.focus\:sl-ring-success:focus{--tw-ring-color:hsla(var(--success-h),84%,34%,var(--tw-ring-opacity)) }.focus\:sl-ring-warning:focus{--tw-ring-color:hsla(var(--warning-h),89%,56%,var(--tw-ring-opacity)) }.focus\:sl-ring-danger:focus{--tw-ring-color:hsla(var(--danger-h),84%,63%,var(--tw-ring-opacity)) }.sl-ring-opacity-0{--tw-ring-opacity:0}.sl-ring-opacity-5{--tw-ring-opacity:0.05}.sl-ring-opacity-10{--tw-ring-opacity:0.1}.sl-ring-opacity-20{--tw-ring-opacity:0.2}.sl-ring-opacity-30{--tw-ring-opacity:0.3}.sl-ring-opacity-40{--tw-ring-opacity:0.4}.sl-ring-opacity-50{--tw-ring-opacity:0.5}.sl-ring-opacity-60{--tw-ring-opacity:0.6}.sl-ring-opacity-70{--tw-ring-opacity:0.7}.sl-ring-opacity-90{--tw-ring-opacity:0.9}.sl-ring-opacity-100{--tw-ring-opacity:1}.focus\:sl-ring-opacity-0:focus{--tw-ring-opacity:0}.focus\:sl-ring-opacity-5:focus{--tw-ring-opacity:0.05}.focus\:sl-ring-opacity-10:focus{--tw-ring-opacity:0.1}.focus\:sl-ring-opacity-20:focus{--tw-ring-opacity:0.2}.focus\:sl-ring-opacity-30:focus{--tw-ring-opacity:0.3}.focus\:sl-ring-opacity-40:focus{--tw-ring-opacity:0.4}.focus\:sl-ring-opacity-50:focus{--tw-ring-opacity:0.5}.focus\:sl-ring-opacity-60:focus{--tw-ring-opacity:0.6}.focus\:sl-ring-opacity-70:focus{--tw-ring-opacity:0.7}.focus\:sl-ring-opacity-90:focus{--tw-ring-opacity:0.9}.focus\:sl-ring-opacity-100:focus{--tw-ring-opacity:1}*{--tw-ring-inset:var(--tw-empty,/*!*/ /*!*/);--tw-ring-offset-width:0px;--tw-ring-offset-color:#fff;--tw-ring-color:rgba(147,197,253,.5);--tw-ring-offset-shadow:0 0 #0000;--tw-ring-shadow:0 0 #0000}.sl-ring{--tw-ring-offset-shadow:var(--tw-ring-inset) 0 0 0 var(--tw-ring-offset-width) var(--tw-ring-offset-color);--tw-ring-shadow:var(--tw-ring-inset) 0 0 0 calc(3px + var(--tw-ring-offset-width)) var(--tw-ring-color);box-shadow:var(--tw-ring-offset-shadow),var(--tw-ring-shadow),var(--tw-shadow,0 0 #0000)}.sl-ring-inset{--tw-ring-inset:inset}.focus\:sl-ring:focus{--tw-ring-offset-shadow:var(--tw-ring-inset) 0 0 0 var(--tw-ring-offset-width) var(--tw-ring-offset-color);--tw-ring-shadow:var(--tw-ring-inset) 0 0 0 calc(3px + var(--tw-ring-offset-width)) var(--tw-ring-color);box-shadow:var(--tw-ring-offset-shadow),var(--tw-ring-shadow),var(--tw-shadow,0 0 
#0000)}.focus\:sl-ring-inset:focus{--tw-ring-inset:inset}.sl-stroke-transparent{stroke:transparent}.sl-stroke-current{stroke:currentColor}.sl-stroke-lighten-100{stroke:var(--color-lighten-100)}.sl-stroke-darken-100{stroke:var(--color-darken-100)}.sl-stroke-primary{stroke:var(--color-primary)}.sl-stroke-primary-tint{stroke:var(--color-primary-tint)}.sl-stroke-primary-light{stroke:var(--color-primary-light)}.sl-stroke-primary-dark{stroke:var(--color-primary-dark)}.sl-stroke-primary-darker{stroke:var(--color-primary-darker)}.sl-stroke-success{stroke:var(--color-success)}.sl-stroke-success-tint{stroke:var(--color-success-tint)}.sl-stroke-success-light{stroke:var(--color-success-light)}.sl-stroke-success-dark{stroke:var(--color-success-dark)}.sl-stroke-success-darker{stroke:var(--color-success-darker)}.sl-stroke-warning{stroke:var(--color-warning)}.sl-stroke-warning-tint{stroke:var(--color-warning-tint)}.sl-stroke-warning-light{stroke:var(--color-warning-light)}.sl-stroke-warning-dark{stroke:var(--color-warning-dark)}.sl-stroke-warning-darker{stroke:var(--color-warning-darker)}.sl-stroke-danger{stroke:var(--color-danger)}.sl-stroke-danger-tint{stroke:var(--color-danger-tint)}.sl-stroke-danger-light{stroke:var(--color-danger-light)}.sl-stroke-danger-dark{stroke:var(--color-danger-dark)}.sl-stroke-danger-darker{stroke:var(--color-danger-darker)}.sl-stroke-code{stroke:var(--color-code)}.sl-stroke-on-code{stroke:var(--color-on-code)}.sl-stroke-on-primary{stroke:var(--color-on-primary)}.sl-stroke-on-success{stroke:var(--color-on-success)}.sl-stroke-on-warning{stroke:var(--color-on-warning)}.sl-stroke-on-danger{stroke:var(--color-on-danger)}.sl-stroke-text{stroke:var(--color-text)}.sl-table-auto{table-layout:auto}.sl-table-fixed{table-layout:fixed}.sl-text-left{text-align:left}.sl-text-center{text-align:center}.sl-text-right{text-align:right}.sl-text-justify{text-align:justify}.sl-text-transparent{color:transparent}.sl-text-current{color:currentColor}.sl-text-lighten-100{color:var(--color-lighten-100)}.sl-text-darken-100{color:var(--color-darken-100)}.sl-text-primary{color:var(--color-primary)}.sl-text-primary-tint{color:var(--color-primary-tint)}.sl-text-primary-light{color:var(--color-primary-light)}.sl-text-primary-dark{color:var(--color-primary-dark)}.sl-text-primary-darker{color:var(--color-primary-darker)}.sl-text-success{color:var(--color-success)}.sl-text-success-tint{color:var(--color-success-tint)}.sl-text-success-light{color:var(--color-success-light)}.sl-text-success-dark{color:var(--color-success-dark)}.sl-text-success-darker{color:var(--color-success-darker)}.sl-text-warning{color:var(--color-warning)}.sl-text-warning-tint{color:var(--color-warning-tint)}.sl-text-warning-light{color:var(--color-warning-light)}.sl-text-warning-dark{color:var(--color-warning-dark)}.sl-text-warning-darker{color:var(--color-warning-darker)}.sl-text-danger{color:var(--color-danger)}.sl-text-danger-tint{color:var(--color-danger-tint)}.sl-text-danger-light{color:var(--color-danger-light)}.sl-text-danger-dark{color:var(--color-danger-dark)}.sl-text-danger-darker{color:var(--color-danger-darker)}.sl-text-code{color:var(--color-code)}.sl-text-on-code{color:var(--color-on-code)}.sl-text-on-primary{color:var(--color-on-primary)}.sl-text-on-success{color:var(--color-on-success)}.sl-text-on-warning{color:var(--color-on-warning)}.sl-text-on-danger{color:var(--color-on-danger)}.sl-text-body{color:var(--color-text)}.sl-text-muted{color:var(--color-text-muted)}.sl-text-light{color:var(--color-text-light)}.sl-text-heading{
color:var(--color-text-heading)}.sl-text-paragraph{color:var(--color-text-paragraph)}.sl-text-canvas-50{color:var(--color-canvas-50)}.sl-text-canvas-100{color:var(--color-canvas-100)}.sl-text-canvas-200{color:var(--color-canvas-200)}.sl-text-canvas-300{color:var(--color-canvas-300)}.sl-text-canvas-pure{color:var(--color-canvas-pure)}.sl-text-canvas{color:var(--color-canvas)}.sl-text-canvas-dialog{color:var(--color-canvas-dialog)}.sl-text-link{color:var(--color-link)}.sl-text-link-dark{color:var(--color-link-dark)}.hover\:sl-text-transparent:hover{color:transparent}.hover\:sl-text-current:hover{color:currentColor}.hover\:sl-text-lighten-100:hover{color:var(--color-lighten-100)}.hover\:sl-text-darken-100:hover{color:var(--color-darken-100)}.hover\:sl-text-primary:hover{color:var(--color-primary)}.hover\:sl-text-primary-tint:hover{color:var(--color-primary-tint)}.hover\:sl-text-primary-light:hover{color:var(--color-primary-light)}.hover\:sl-text-primary-dark:hover{color:var(--color-primary-dark)}.hover\:sl-text-primary-darker:hover{color:var(--color-primary-darker)}.hover\:sl-text-success:hover{color:var(--color-success)}.hover\:sl-text-success-tint:hover{color:var(--color-success-tint)}.hover\:sl-text-success-light:hover{color:var(--color-success-light)}.hover\:sl-text-success-dark:hover{color:var(--color-success-dark)}.hover\:sl-text-success-darker:hover{color:var(--color-success-darker)}.hover\:sl-text-warning:hover{color:var(--color-warning)}.hover\:sl-text-warning-tint:hover{color:var(--color-warning-tint)}.hover\:sl-text-warning-light:hover{color:var(--color-warning-light)}.hover\:sl-text-warning-dark:hover{color:var(--color-warning-dark)}.hover\:sl-text-warning-darker:hover{color:var(--color-warning-darker)}.hover\:sl-text-danger:hover{color:var(--color-danger)}.hover\:sl-text-danger-tint:hover{color:var(--color-danger-tint)}.hover\:sl-text-danger-light:hover{color:var(--color-danger-light)}.hover\:sl-text-danger-dark:hover{color:var(--color-danger-dark)}.hover\:sl-text-danger-darker:hover{color:var(--color-danger-darker)}.hover\:sl-text-code:hover{color:var(--color-code)}.hover\:sl-text-on-code:hover{color:var(--color-on-code)}.hover\:sl-text-on-primary:hover{color:var(--color-on-primary)}.hover\:sl-text-on-success:hover{color:var(--color-on-success)}.hover\:sl-text-on-warning:hover{color:var(--color-on-warning)}.hover\:sl-text-on-danger:hover{color:var(--color-on-danger)}.hover\:sl-text-body:hover{color:var(--color-text)}.hover\:sl-text-muted:hover{color:var(--color-text-muted)}.hover\:sl-text-light:hover{color:var(--color-text-light)}.hover\:sl-text-heading:hover{color:var(--color-text-heading)}.hover\:sl-text-paragraph:hover{color:var(--color-text-paragraph)}.hover\:sl-text-canvas-50:hover{color:var(--color-canvas-50)}.hover\:sl-text-canvas-100:hover{color:var(--color-canvas-100)}.hover\:sl-text-canvas-200:hover{color:var(--color-canvas-200)}.hover\:sl-text-canvas-300:hover{color:var(--color-canvas-300)}.hover\:sl-text-canvas-pure:hover{color:var(--color-canvas-pure)}.hover\:sl-text-canvas:hover{color:var(--color-canvas)}.hover\:sl-text-canvas-dialog:hover{color:var(--color-canvas-dialog)}.hover\:sl-text-link:hover{color:var(--color-link)}.hover\:sl-text-link-dark:hover{color:var(--color-link-dark)}.focus\:sl-text-transparent:focus{color:transparent}.focus\:sl-text-current:focus{color:currentColor}.focus\:sl-text-lighten-100:focus{color:var(--color-lighten-100)}.focus\:sl-text-darken-100:focus{color:var(--color-darken-100)}.focus\:sl-text-primary:focus{color:var(--color-primary)}.fo
cus\:sl-text-primary-tint:focus{color:var(--color-primary-tint)}.focus\:sl-text-primary-light:focus{color:var(--color-primary-light)}.focus\:sl-text-primary-dark:focus{color:var(--color-primary-dark)}.focus\:sl-text-primary-darker:focus{color:var(--color-primary-darker)}.focus\:sl-text-success:focus{color:var(--color-success)}.focus\:sl-text-success-tint:focus{color:var(--color-success-tint)}.focus\:sl-text-success-light:focus{color:var(--color-success-light)}.focus\:sl-text-success-dark:focus{color:var(--color-success-dark)}.focus\:sl-text-success-darker:focus{color:var(--color-success-darker)}.focus\:sl-text-warning:focus{color:var(--color-warning)}.focus\:sl-text-warning-tint:focus{color:var(--color-warning-tint)}.focus\:sl-text-warning-light:focus{color:var(--color-warning-light)}.focus\:sl-text-warning-dark:focus{color:var(--color-warning-dark)}.focus\:sl-text-warning-darker:focus{color:var(--color-warning-darker)}.focus\:sl-text-danger:focus{color:var(--color-danger)}.focus\:sl-text-danger-tint:focus{color:var(--color-danger-tint)}.focus\:sl-text-danger-light:focus{color:var(--color-danger-light)}.focus\:sl-text-danger-dark:focus{color:var(--color-danger-dark)}.focus\:sl-text-danger-darker:focus{color:var(--color-danger-darker)}.focus\:sl-text-code:focus{color:var(--color-code)}.focus\:sl-text-on-code:focus{color:var(--color-on-code)}.focus\:sl-text-on-primary:focus{color:var(--color-on-primary)}.focus\:sl-text-on-success:focus{color:var(--color-on-success)}.focus\:sl-text-on-warning:focus{color:var(--color-on-warning)}.focus\:sl-text-on-danger:focus{color:var(--color-on-danger)}.focus\:sl-text-body:focus{color:var(--color-text)}.focus\:sl-text-muted:focus{color:var(--color-text-muted)}.focus\:sl-text-light:focus{color:var(--color-text-light)}.focus\:sl-text-heading:focus{color:var(--color-text-heading)}.focus\:sl-text-paragraph:focus{color:var(--color-text-paragraph)}.focus\:sl-text-canvas-50:focus{color:var(--color-canvas-50)}.focus\:sl-text-canvas-100:focus{color:var(--color-canvas-100)}.focus\:sl-text-canvas-200:focus{color:var(--color-canvas-200)}.focus\:sl-text-canvas-300:focus{color:var(--color-canvas-300)}.focus\:sl-text-canvas-pure:focus{color:var(--color-canvas-pure)}.focus\:sl-text-canvas:focus{color:var(--color-canvas)}.focus\:sl-text-canvas-dialog:focus{color:var(--color-canvas-dialog)}.focus\:sl-text-link:focus{color:var(--color-link)}.focus\:sl-text-link-dark:focus{color:var(--color-link-dark)}.disabled\:sl-text-transparent:disabled{color:transparent}.disabled\:sl-text-current:disabled{color:currentColor}.disabled\:sl-text-lighten-100:disabled{color:var(--color-lighten-100)}.disabled\:sl-text-darken-100:disabled{color:var(--color-darken-100)}.disabled\:sl-text-primary:disabled{color:var(--color-primary)}.disabled\:sl-text-primary-tint:disabled{color:var(--color-primary-tint)}.disabled\:sl-text-primary-light:disabled{color:var(--color-primary-light)}.disabled\:sl-text-primary-dark:disabled{color:var(--color-primary-dark)}.disabled\:sl-text-primary-darker:disabled{color:var(--color-primary-darker)}.disabled\:sl-text-success:disabled{color:var(--color-success)}.disabled\:sl-text-success-tint:disabled{color:var(--color-success-tint)}.disabled\:sl-text-success-light:disabled{color:var(--color-success-light)}.disabled\:sl-text-success-dark:disabled{color:var(--color-success-dark)}.disabled\:sl-text-success-darker:disabled{color:var(--color-success-darker)}.disabled\:sl-text-warning:disabled{color:var(--color-warning)}.disabled\:sl-text-warning-tint:disabled{color:var(--color-
warning-tint)}.disabled\:sl-text-warning-light:disabled{color:var(--color-warning-light)}.disabled\:sl-text-warning-dark:disabled{color:var(--color-warning-dark)}.disabled\:sl-text-warning-darker:disabled{color:var(--color-warning-darker)}.disabled\:sl-text-danger:disabled{color:var(--color-danger)}.disabled\:sl-text-danger-tint:disabled{color:var(--color-danger-tint)}.disabled\:sl-text-danger-light:disabled{color:var(--color-danger-light)}.disabled\:sl-text-danger-dark:disabled{color:var(--color-danger-dark)}.disabled\:sl-text-danger-darker:disabled{color:var(--color-danger-darker)}.disabled\:sl-text-code:disabled{color:var(--color-code)}.disabled\:sl-text-on-code:disabled{color:var(--color-on-code)}.disabled\:sl-text-on-primary:disabled{color:var(--color-on-primary)}.disabled\:sl-text-on-success:disabled{color:var(--color-on-success)}.disabled\:sl-text-on-warning:disabled{color:var(--color-on-warning)}.disabled\:sl-text-on-danger:disabled{color:var(--color-on-danger)}.disabled\:sl-text-body:disabled{color:var(--color-text)}.disabled\:sl-text-muted:disabled{color:var(--color-text-muted)}.disabled\:sl-text-light:disabled{color:var(--color-text-light)}.disabled\:sl-text-heading:disabled{color:var(--color-text-heading)}.disabled\:sl-text-paragraph:disabled{color:var(--color-text-paragraph)}.disabled\:sl-text-canvas-50:disabled{color:var(--color-canvas-50)}.disabled\:sl-text-canvas-100:disabled{color:var(--color-canvas-100)}.disabled\:sl-text-canvas-200:disabled{color:var(--color-canvas-200)}.disabled\:sl-text-canvas-300:disabled{color:var(--color-canvas-300)}.disabled\:sl-text-canvas-pure:disabled{color:var(--color-canvas-pure)}.disabled\:sl-text-canvas:disabled{color:var(--color-canvas)}.disabled\:sl-text-canvas-dialog:disabled{color:var(--color-canvas-dialog)}.disabled\:sl-text-link:disabled{color:var(--color-link)}.disabled\:sl-text-link-dark:disabled{color:var(--color-link-dark)}.sl-underline{text-decoration:underline}.sl-line-through{text-decoration:line-through}.sl-no-underline{text-decoration:none}.hover\:sl-underline:hover{text-decoration:underline}.hover\:sl-line-through:hover{text-decoration:line-through}.hover\:sl-no-underline:hover{text-decoration:none}.sl-truncate{overflow:hidden;white-space:nowrap}.sl-overflow-ellipsis,.sl-truncate{text-overflow:ellipsis}.sl-overflow-clip{text-overflow:clip}.sl-uppercase{text-transform:uppercase}.sl-lowercase{text-transform:lowercase}.sl-capitalize{text-transform:capitalize}.sl-normal-case{text-transform:none}.sl-transform{transform:translateX(var(--tw-translate-x)) translateY(var(--tw-translate-y)) rotate(var(--tw-rotate)) skewX(var(--tw-skew-x)) skewY(var(--tw-skew-y)) scaleX(var(--tw-scale-x)) scaleY(var(--tw-scale-y))}.sl-transform,.sl-transform-gpu{--tw-translate-x:0;--tw-translate-y:0;--tw-rotate:0;--tw-skew-x:0;--tw-skew-y:0;--tw-scale-x:1;--tw-scale-y:1}.sl-transform-gpu{transform:translate3d(var(--tw-translate-x),var(--tw-translate-y),0) rotate(var(--tw-rotate)) skewX(var(--tw-skew-x)) skewY(var(--tw-skew-y)) scaleX(var(--tw-scale-x)) 
scaleY(var(--tw-scale-y))}.sl-transform-none{transform:none}.sl-delay-75{transition-delay:75ms}.sl-delay-150{transition-delay:.15s}.sl-delay-300{transition-delay:.3s}.sl-delay-500{transition-delay:.5s}.sl-delay-1000{transition-delay:1s}.sl-duration-75{transition-duration:75ms}.sl-duration-150{transition-duration:.15s}.sl-duration-300{transition-duration:.3s}.sl-duration-500{transition-duration:.5s}.sl-duration-1000{transition-duration:1s}.sl-transition{transition-duration:.15s;transition-property:background-color,border-color,color,fill,stroke,opacity,box-shadow,transform;transition-timing-function:cubic-bezier(.4,0,.2,1)}.sl-translate-x-0{--tw-translate-x:0px}.sl-translate-x-1{--tw-translate-x:4px}.sl-translate-x-2{--tw-translate-x:8px}.sl-translate-x-3{--tw-translate-x:12px}.sl-translate-x-4{--tw-translate-x:16px}.sl-translate-x-5{--tw-translate-x:20px}.sl-translate-x-6{--tw-translate-x:24px}.sl-translate-x-7{--tw-translate-x:28px}.sl-translate-x-8{--tw-translate-x:32px}.sl-translate-x-9{--tw-translate-x:36px}.sl-translate-x-10{--tw-translate-x:40px}.sl-translate-x-11{--tw-translate-x:44px}.sl-translate-x-12{--tw-translate-x:48px}.sl-translate-x-14{--tw-translate-x:56px}.sl-translate-x-16{--tw-translate-x:64px}.sl-translate-x-20{--tw-translate-x:80px}.sl-translate-x-24{--tw-translate-x:96px}.sl-translate-x-28{--tw-translate-x:112px}.sl-translate-x-32{--tw-translate-x:128px}.sl-translate-x-36{--tw-translate-x:144px}.sl-translate-x-40{--tw-translate-x:160px}.sl-translate-x-44{--tw-translate-x:176px}.sl-translate-x-48{--tw-translate-x:192px}.sl-translate-x-52{--tw-translate-x:208px}.sl-translate-x-56{--tw-translate-x:224px}.sl-translate-x-60{--tw-translate-x:240px}.sl-translate-x-64{--tw-translate-x:256px}.sl-translate-x-72{--tw-translate-x:288px}.sl-translate-x-80{--tw-translate-x:320px}.sl-translate-x-96{--tw-translate-x:384px}.sl-translate-x-px{--tw-translate-x:1px}.sl-translate-x-0\.5{--tw-translate-x:2px}.sl-translate-x-1\.5{--tw-translate-x:6px}.sl-translate-x-2\.5{--tw-translate-x:10px}.sl-translate-x-3\.5{--tw-translate-x:14px}.sl-translate-x-4\.5{--tw-translate-x:18px}.sl--translate-x-0{--tw-translate-x:0px}.sl--translate-x-1{--tw-translate-x:-4px}.sl--translate-x-2{--tw-translate-x:-8px}.sl--translate-x-3{--tw-translate-x:-12px}.sl--translate-x-4{--tw-translate-x:-16px}.sl--translate-x-5{--tw-translate-x:-20px}.sl--translate-x-6{--tw-translate-x:-24px}.sl--translate-x-7{--tw-translate-x:-28px}.sl--translate-x-8{--tw-translate-x:-32px}.sl--translate-x-9{--tw-translate-x:-36px}.sl--translate-x-10{--tw-translate-x:-40px}.sl--translate-x-11{--tw-translate-x:-44px}.sl--translate-x-12{--tw-translate-x:-48px}.sl--translate-x-14{--tw-translate-x:-56px}.sl--translate-x-16{--tw-translate-x:-64px}.sl--translate-x-20{--tw-translate-x:-80px}.sl--translate-x-24{--tw-translate-x:-96px}.sl--translate-x-28{--tw-translate-x:-112px}.sl--translate-x-32{--tw-translate-x:-128px}.sl--translate-x-36{--tw-translate-x:-144px}.sl--translate-x-40{--tw-translate-x:-160px}.sl--translate-x-44{--tw-translate-x:-176px}.sl--translate-x-48{--tw-translate-x:-192px}.sl--translate-x-52{--tw-translate-x:-208px}.sl--translate-x-56{--tw-translate-x:-224px}.sl--translate-x-60{--tw-translate-x:-240px}.sl--translate-x-64{--tw-translate-x:-256px}.sl--translate-x-72{--tw-translate-x:-288px}.sl--translate-x-80{--tw-translate-x:-320px}.sl--translate-x-96{--tw-translate-x:-384px}.sl--translate-x-px{--tw-translate-x:-1px}.sl--translate-x-0\.5{--tw-translate-x:-2px}.sl--translate-x-1\.5{--tw-translate-x:-6px}.sl--translate-x-2\.5{-
-tw-translate-x:-10px}.sl--translate-x-3\.5{--tw-translate-x:-14px}.sl--translate-x-4\.5{--tw-translate-x:-18px}.sl-translate-y-0{--tw-translate-y:0px}.sl-translate-y-1{--tw-translate-y:4px}.sl-translate-y-2{--tw-translate-y:8px}.sl-translate-y-3{--tw-translate-y:12px}.sl-translate-y-4{--tw-translate-y:16px}.sl-translate-y-5{--tw-translate-y:20px}.sl-translate-y-6{--tw-translate-y:24px}.sl-translate-y-7{--tw-translate-y:28px}.sl-translate-y-8{--tw-translate-y:32px}.sl-translate-y-9{--tw-translate-y:36px}.sl-translate-y-10{--tw-translate-y:40px}.sl-translate-y-11{--tw-translate-y:44px}.sl-translate-y-12{--tw-translate-y:48px}.sl-translate-y-14{--tw-translate-y:56px}.sl-translate-y-16{--tw-translate-y:64px}.sl-translate-y-20{--tw-translate-y:80px}.sl-translate-y-24{--tw-translate-y:96px}.sl-translate-y-28{--tw-translate-y:112px}.sl-translate-y-32{--tw-translate-y:128px}.sl-translate-y-36{--tw-translate-y:144px}.sl-translate-y-40{--tw-translate-y:160px}.sl-translate-y-44{--tw-translate-y:176px}.sl-translate-y-48{--tw-translate-y:192px}.sl-translate-y-52{--tw-translate-y:208px}.sl-translate-y-56{--tw-translate-y:224px}.sl-translate-y-60{--tw-translate-y:240px}.sl-translate-y-64{--tw-translate-y:256px}.sl-translate-y-72{--tw-translate-y:288px}.sl-translate-y-80{--tw-translate-y:320px}.sl-translate-y-96{--tw-translate-y:384px}.sl-translate-y-px{--tw-translate-y:1px}.sl-translate-y-0\.5{--tw-translate-y:2px}.sl-translate-y-1\.5{--tw-translate-y:6px}.sl-translate-y-2\.5{--tw-translate-y:10px}.sl-translate-y-3\.5{--tw-translate-y:14px}.sl-translate-y-4\.5{--tw-translate-y:18px}.sl--translate-y-0{--tw-translate-y:0px}.sl--translate-y-1{--tw-translate-y:-4px}.sl--translate-y-2{--tw-translate-y:-8px}.sl--translate-y-3{--tw-translate-y:-12px}.sl--translate-y-4{--tw-translate-y:-16px}.sl--translate-y-5{--tw-translate-y:-20px}.sl--translate-y-6{--tw-translate-y:-24px}.sl--translate-y-7{--tw-translate-y:-28px}.sl--translate-y-8{--tw-translate-y:-32px}.sl--translate-y-9{--tw-translate-y:-36px}.sl--translate-y-10{--tw-translate-y:-40px}.sl--translate-y-11{--tw-translate-y:-44px}.sl--translate-y-12{--tw-translate-y:-48px}.sl--translate-y-14{--tw-translate-y:-56px}.sl--translate-y-16{--tw-translate-y:-64px}.sl--translate-y-20{--tw-translate-y:-80px}.sl--translate-y-24{--tw-translate-y:-96px}.sl--translate-y-28{--tw-translate-y:-112px}.sl--translate-y-32{--tw-translate-y:-128px}.sl--translate-y-36{--tw-translate-y:-144px}.sl--translate-y-40{--tw-translate-y:-160px}.sl--translate-y-44{--tw-translate-y:-176px}.sl--translate-y-48{--tw-translate-y:-192px}.sl--translate-y-52{--tw-translate-y:-208px}.sl--translate-y-56{--tw-translate-y:-224px}.sl--translate-y-60{--tw-translate-y:-240px}.sl--translate-y-64{--tw-translate-y:-256px}.sl--translate-y-72{--tw-translate-y:-288px}.sl--translate-y-80{--tw-translate-y:-320px}.sl--translate-y-96{--tw-translate-y:-384px}.sl--translate-y-px{--tw-translate-y:-1px}.sl--translate-y-0\.5{--tw-translate-y:-2px}.sl--translate-y-1\.5{--tw-translate-y:-6px}.sl--translate-y-2\.5{--tw-translate-y:-10px}.sl--translate-y-3\.5{--tw-translate-y:-14px}.sl--translate-y-4\.5{--tw-translate-y:-18px}.hover\:sl-translate-x-0:hover{--tw-translate-x:0px}.hover\:sl-translate-x-1:hover{--tw-translate-x:4px}.hover\:sl-translate-x-2:hover{--tw-translate-x:8px}.hover\:sl-translate-x-3:hover{--tw-translate-x:12px}.hover\:sl-translate-x-4:hover{--tw-translate-x:16px}.hover\:sl-translate-x-5:hover{--tw-translate-x:20px}.hover\:sl-translate-x-6:hover{--tw-translate-x:24px}.hover\:sl-translate-x-7:hover{--tw-tr
anslate-x:28px}.hover\:sl-translate-x-8:hover{--tw-translate-x:32px}.hover\:sl-translate-x-9:hover{--tw-translate-x:36px}.hover\:sl-translate-x-10:hover{--tw-translate-x:40px}.hover\:sl-translate-x-11:hover{--tw-translate-x:44px}.hover\:sl-translate-x-12:hover{--tw-translate-x:48px}.hover\:sl-translate-x-14:hover{--tw-translate-x:56px}.hover\:sl-translate-x-16:hover{--tw-translate-x:64px}.hover\:sl-translate-x-20:hover{--tw-translate-x:80px}.hover\:sl-translate-x-24:hover{--tw-translate-x:96px}.hover\:sl-translate-x-28:hover{--tw-translate-x:112px}.hover\:sl-translate-x-32:hover{--tw-translate-x:128px}.hover\:sl-translate-x-36:hover{--tw-translate-x:144px}.hover\:sl-translate-x-40:hover{--tw-translate-x:160px}.hover\:sl-translate-x-44:hover{--tw-translate-x:176px}.hover\:sl-translate-x-48:hover{--tw-translate-x:192px}.hover\:sl-translate-x-52:hover{--tw-translate-x:208px}.hover\:sl-translate-x-56:hover{--tw-translate-x:224px}.hover\:sl-translate-x-60:hover{--tw-translate-x:240px}.hover\:sl-translate-x-64:hover{--tw-translate-x:256px}.hover\:sl-translate-x-72:hover{--tw-translate-x:288px}.hover\:sl-translate-x-80:hover{--tw-translate-x:320px}.hover\:sl-translate-x-96:hover{--tw-translate-x:384px}.hover\:sl-translate-x-px:hover{--tw-translate-x:1px}.hover\:sl-translate-x-0\.5:hover{--tw-translate-x:2px}.hover\:sl-translate-x-1\.5:hover{--tw-translate-x:6px}.hover\:sl-translate-x-2\.5:hover{--tw-translate-x:10px}.hover\:sl-translate-x-3\.5:hover{--tw-translate-x:14px}.hover\:sl-translate-x-4\.5:hover{--tw-translate-x:18px}.hover\:sl--translate-x-0:hover{--tw-translate-x:0px}.hover\:sl--translate-x-1:hover{--tw-translate-x:-4px}.hover\:sl--translate-x-2:hover{--tw-translate-x:-8px}.hover\:sl--translate-x-3:hover{--tw-translate-x:-12px}.hover\:sl--translate-x-4:hover{--tw-translate-x:-16px}.hover\:sl--translate-x-5:hover{--tw-translate-x:-20px}.hover\:sl--translate-x-6:hover{--tw-translate-x:-24px}.hover\:sl--translate-x-7:hover{--tw-translate-x:-28px}.hover\:sl--translate-x-8:hover{--tw-translate-x:-32px}.hover\:sl--translate-x-9:hover{--tw-translate-x:-36px}.hover\:sl--translate-x-10:hover{--tw-translate-x:-40px}.hover\:sl--translate-x-11:hover{--tw-translate-x:-44px}.hover\:sl--translate-x-12:hover{--tw-translate-x:-48px}.hover\:sl--translate-x-14:hover{--tw-translate-x:-56px}.hover\:sl--translate-x-16:hover{--tw-translate-x:-64px}.hover\:sl--translate-x-20:hover{--tw-translate-x:-80px}.hover\:sl--translate-x-24:hover{--tw-translate-x:-96px}.hover\:sl--translate-x-28:hover{--tw-translate-x:-112px}.hover\:sl--translate-x-32:hover{--tw-translate-x:-128px}.hover\:sl--translate-x-36:hover{--tw-translate-x:-144px}.hover\:sl--translate-x-40:hover{--tw-translate-x:-160px}.hover\:sl--translate-x-44:hover{--tw-translate-x:-176px}.hover\:sl--translate-x-48:hover{--tw-translate-x:-192px}.hover\:sl--translate-x-52:hover{--tw-translate-x:-208px}.hover\:sl--translate-x-56:hover{--tw-translate-x:-224px}.hover\:sl--translate-x-60:hover{--tw-translate-x:-240px}.hover\:sl--translate-x-64:hover{--tw-translate-x:-256px}.hover\:sl--translate-x-72:hover{--tw-translate-x:-288px}.hover\:sl--translate-x-80:hover{--tw-translate-x:-320px}.hover\:sl--translate-x-96:hover{--tw-translate-x:-384px}.hover\:sl--translate-x-px:hover{--tw-translate-x:-1px}.hover\:sl--translate-x-0\.5:hover{--tw-translate-x:-2px}.hover\:sl--translate-x-1\.5:hover{--tw-translate-x:-6px}.hover\:sl--translate-x-2\.5:hover{--tw-translate-x:-10px}.hover\:sl--translate-x-3\.5:hover{--tw-translate-x:-14px}.hover\:sl--translate-x-4\.5:hover{--tw-transl
ate-x:-18px}.hover\:sl-translate-y-0:hover{--tw-translate-y:0px}.hover\:sl-translate-y-1:hover{--tw-translate-y:4px}.hover\:sl-translate-y-2:hover{--tw-translate-y:8px}.hover\:sl-translate-y-3:hover{--tw-translate-y:12px}.hover\:sl-translate-y-4:hover{--tw-translate-y:16px}.hover\:sl-translate-y-5:hover{--tw-translate-y:20px}.hover\:sl-translate-y-6:hover{--tw-translate-y:24px}.hover\:sl-translate-y-7:hover{--tw-translate-y:28px}.hover\:sl-translate-y-8:hover{--tw-translate-y:32px}.hover\:sl-translate-y-9:hover{--tw-translate-y:36px}.hover\:sl-translate-y-10:hover{--tw-translate-y:40px}.hover\:sl-translate-y-11:hover{--tw-translate-y:44px}.hover\:sl-translate-y-12:hover{--tw-translate-y:48px}.hover\:sl-translate-y-14:hover{--tw-translate-y:56px}.hover\:sl-translate-y-16:hover{--tw-translate-y:64px}.hover\:sl-translate-y-20:hover{--tw-translate-y:80px}.hover\:sl-translate-y-24:hover{--tw-translate-y:96px}.hover\:sl-translate-y-28:hover{--tw-translate-y:112px}.hover\:sl-translate-y-32:hover{--tw-translate-y:128px}.hover\:sl-translate-y-36:hover{--tw-translate-y:144px}.hover\:sl-translate-y-40:hover{--tw-translate-y:160px}.hover\:sl-translate-y-44:hover{--tw-translate-y:176px}.hover\:sl-translate-y-48:hover{--tw-translate-y:192px}.hover\:sl-translate-y-52:hover{--tw-translate-y:208px}.hover\:sl-translate-y-56:hover{--tw-translate-y:224px}.hover\:sl-translate-y-60:hover{--tw-translate-y:240px}.hover\:sl-translate-y-64:hover{--tw-translate-y:256px}.hover\:sl-translate-y-72:hover{--tw-translate-y:288px}.hover\:sl-translate-y-80:hover{--tw-translate-y:320px}.hover\:sl-translate-y-96:hover{--tw-translate-y:384px}.hover\:sl-translate-y-px:hover{--tw-translate-y:1px}.hover\:sl-translate-y-0\.5:hover{--tw-translate-y:2px}.hover\:sl-translate-y-1\.5:hover{--tw-translate-y:6px}.hover\:sl-translate-y-2\.5:hover{--tw-translate-y:10px}.hover\:sl-translate-y-3\.5:hover{--tw-translate-y:14px}.hover\:sl-translate-y-4\.5:hover{--tw-translate-y:18px}.hover\:sl--translate-y-0:hover{--tw-translate-y:0px}.hover\:sl--translate-y-1:hover{--tw-translate-y:-4px}.hover\:sl--translate-y-2:hover{--tw-translate-y:-8px}.hover\:sl--translate-y-3:hover{--tw-translate-y:-12px}.hover\:sl--translate-y-4:hover{--tw-translate-y:-16px}.hover\:sl--translate-y-5:hover{--tw-translate-y:-20px}.hover\:sl--translate-y-6:hover{--tw-translate-y:-24px}.hover\:sl--translate-y-7:hover{--tw-translate-y:-28px}.hover\:sl--translate-y-8:hover{--tw-translate-y:-32px}.hover\:sl--translate-y-9:hover{--tw-translate-y:-36px}.hover\:sl--translate-y-10:hover{--tw-translate-y:-40px}.hover\:sl--translate-y-11:hover{--tw-translate-y:-44px}.hover\:sl--translate-y-12:hover{--tw-translate-y:-48px}.hover\:sl--translate-y-14:hover{--tw-translate-y:-56px}.hover\:sl--translate-y-16:hover{--tw-translate-y:-64px}.hover\:sl--translate-y-20:hover{--tw-translate-y:-80px}.hover\:sl--translate-y-24:hover{--tw-translate-y:-96px}.hover\:sl--translate-y-28:hover{--tw-translate-y:-112px}.hover\:sl--translate-y-32:hover{--tw-translate-y:-128px}.hover\:sl--translate-y-36:hover{--tw-translate-y:-144px}.hover\:sl--translate-y-40:hover{--tw-translate-y:-160px}.hover\:sl--translate-y-44:hover{--tw-translate-y:-176px}.hover\:sl--translate-y-48:hover{--tw-translate-y:-192px}.hover\:sl--translate-y-52:hover{--tw-translate-y:-208px}.hover\:sl--translate-y-56:hover{--tw-translate-y:-224px}.hover\:sl--translate-y-60:hover{--tw-translate-y:-240px}.hover\:sl--translate-y-64:hover{--tw-translate-y:-256px}.hover\:sl--translate-y-72:hover{--tw-translate-y:-288px}.hover\:sl--translate-y-80:
hover{--tw-translate-y:-320px}.hover\:sl--translate-y-96:hover{--tw-translate-y:-384px}.hover\:sl--translate-y-px:hover{--tw-translate-y:-1px}.hover\:sl--translate-y-0\.5:hover{--tw-translate-y:-2px}.hover\:sl--translate-y-1\.5:hover{--tw-translate-y:-6px}.hover\:sl--translate-y-2\.5:hover{--tw-translate-y:-10px}.hover\:sl--translate-y-3\.5:hover{--tw-translate-y:-14px}.hover\:sl--translate-y-4\.5:hover{--tw-translate-y:-18px}.sl-select-none{-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.sl-select-text{-webkit-user-select:text;-moz-user-select:text;-ms-user-select:text;user-select:text}.sl-select-all{-webkit-user-select:all;-moz-user-select:all;user-select:all}.sl-select-auto{-webkit-user-select:auto;-moz-user-select:auto;-ms-user-select:auto;user-select:auto}.sl-align-baseline{vertical-align:baseline}.sl-align-top{vertical-align:top}.sl-align-middle{vertical-align:middle}.sl-align-bottom{vertical-align:bottom}.sl-align-text-top{vertical-align:text-top}.sl-align-text-bottom{vertical-align:text-bottom}.sl-visible{visibility:visible}.sl-invisible{visibility:hidden}.sl-group:hover .group-hover\:sl-visible{visibility:visible}.sl-group:hover .group-hover\:sl-invisible{visibility:hidden}.sl-group:focus .group-focus\:sl-visible{visibility:visible}.sl-group:focus .group-focus\:sl-invisible{visibility:hidden}.sl-whitespace-normal{white-space:normal}.sl-whitespace-nowrap{white-space:nowrap}.sl-whitespace-pre{white-space:pre}.sl-whitespace-pre-line{white-space:pre-line}.sl-whitespace-pre-wrap{white-space:pre-wrap}.sl-w-0{width:0}.sl-w-1{width:4px}.sl-w-2{width:8px}.sl-w-3{width:12px}.sl-w-4{width:16px}.sl-w-5{width:20px}.sl-w-6{width:24px}.sl-w-7{width:28px}.sl-w-8{width:32px}.sl-w-9{width:36px}.sl-w-10{width:40px}.sl-w-11{width:44px}.sl-w-12{width:48px}.sl-w-14{width:56px}.sl-w-16{width:64px}.sl-w-20{width:80px}.sl-w-24{width:96px}.sl-w-28{width:112px}.sl-w-32{width:128px}.sl-w-36{width:144px}.sl-w-40{width:160px}.sl-w-44{width:176px}.sl-w-48{width:192px}.sl-w-52{width:208px}.sl-w-56{width:224px}.sl-w-60{width:240px}.sl-w-64{width:256px}.sl-w-72{width:288px}.sl-w-80{width:320px}.sl-w-96{width:384px}.sl-w-auto{width:auto}.sl-w-px{width:1px}.sl-w-0\.5{width:2px}.sl-w-1\.5{width:6px}.sl-w-2\.5{width:10px}.sl-w-3\.5{width:14px}.sl-w-4\.5{width:18px}.sl-w-xs{width:20px}.sl-w-sm{width:24px}.sl-w-md{width:32px}.sl-w-lg{width:36px}.sl-w-xl{width:44px}.sl-w-2xl{width:52px}.sl-w-3xl{width:60px}.sl-w-1\/2{width:50%}.sl-w-1\/3{width:33.333333%}.sl-w-2\/3{width:66.666667%}.sl-w-1\/4{width:25%}.sl-w-2\/4{width:50%}.sl-w-3\/4{width:75%}.sl-w-1\/5{width:20%}.sl-w-2\/5{width:40%}.sl-w-3\/5{width:60%}.sl-w-4\/5{width:80%}.sl-w-1\/6{width:16.666667%}.sl-w-2\/6{width:33.333333%}.sl-w-3\/6{width:50%}.sl-w-4\/6{width:66.666667%}.sl-w-5\/6{width:83.333333%}.sl-w-full{width:100%}.sl-w-screen{width:100vw}.sl-w-min{width:-moz-min-content;width:min-content}.sl-w-max{width:-moz-max-content;width:max-content}.sl-break-normal{overflow-wrap:normal;word-break:normal}.sl-break-words{overflow-wrap:break-word}.sl-break-all{word-break:break-all}.sl-z-0{z-index:0}.sl-z-10{z-index:10}.sl-z-20{z-index:20}.sl-z-30{z-index:30}.sl-z-40{z-index:40}.sl-z-50{z-index:50}.sl-z-auto{z-index:auto}.focus\:sl-z-0:focus{z-index:0}.focus\:sl-z-10:focus{z-index:10}.focus\:sl-z-20:focus{z-index:20}.focus\:sl-z-30:focus{z-index:30}.focus\:sl-z-40:focus{z-index:40}.focus\:sl-z-50:focus{z-index:50}.focus\:sl-z-auto:focus{z-index:auto}:root{--font-prose:ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,"Segoe 
UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol","Noto Color Emoji";--font-ui:Inter,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol","Noto Color Emoji";--font-mono:"SF Mono",ui-monospace,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace;--font-code:var(--font-mono);--fs-paragraph-leading:22px;--fs-paragraph:16px;--fs-code:14px;--fs-paragraph-small:14px;--fs-paragraph-tiny:12px;--lh-paragraph-leading:1.875;--lh-paragraph:1.625;--lh-code:1.5;--lh-paragraph-small:1.625;--lh-paragraph-tiny:1.625;--color-code:var(--color-canvas-tint);--color-on-code:var(--color-text-heading)}.sl-avatar--with-bg:before{background-color:var(--avatar-bg-color);bottom:0;content:" ";left:0;opacity:var(--avatar-bg-opacity);position:absolute;right:0;top:0}.sl-aspect-ratio:before{content:"";display:block;height:0;padding-bottom:calc(1/var(--ratio)*100%)}.sl-aspect-ratio>:not(style){align-items:center;bottom:0;display:flex;height:100%;justify-content:center;left:0;overflow:hidden;position:absolute;right:0;top:0;width:100%}.sl-aspect-ratio>img,.sl-aspect-ratio>video{object-fit:cover}.sl-badge{align-items:center;border-width:1px;display:inline-flex;outline:2px solid transparent;outline-offset:2px}.sl-badge a{color:var(--color-text-muted)}.sl-badge a:hover{color:var(--color-text);cursor:pointer}.sl-button{align-items:center;display:inline-flex;line-height:0;outline:2px solid transparent;outline-offset:2px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap}.sl-button-group>.sl-button:not(:first-child):not(:last-child){border-radius:0;border-right:0}.sl-button-group>.sl-button:first-child:not(:last-child){border-bottom-right-radius:0;border-right:0;border-top-right-radius:0}.sl-button-group>.sl-button:last-child:not(:first-child){border-bottom-left-radius:0;border-top-left-radius:0}.sl-image--inverted{filter:invert(1) hue-rotate(180deg);mix-blend-mode:screen}.Link,.Link>code{color:var(--color-link)}.Link:hover,.Link:hover>code{color:var(--color-link-dark)}.sl-link-heading:hover .sl-link-heading__icon{opacity:1}.sl-link-heading__icon{opacity:0}.sl-menu{-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.sl-menu--pointer-interactions .sl-menu-item:not(.sl-menu-item--disabled):hover{background-color:var(--color-primary);color:var(--color-on-primary)}.sl-menu--pointer-interactions .sl-menu-item:not(.sl-menu-item--disabled):hover .sl-menu-item__description{color:var(--color-on-primary)}.sl-menu--pointer-interactions .sl-menu-item:not(.sl-menu-item--disabled):hover .sl-menu-item__icon{color:var(--color-on-primary)!important}.sl-menu-item__link-icon,.sl-menu-item__meta-text{opacity:.6}.sl-menu-item--disabled .sl-menu-item__title-wrapper{cursor:not-allowed;opacity:.5}.sl-menu-item--disabled .sl-menu-item__meta-text{cursor:not-allowed;opacity:.4}.sl-menu-item--focused{background-color:var(--color-primary);color:var(--color-on-primary)}.sl-menu-item--focused .sl-menu-item__link-icon,.sl-menu-item--focused .sl-menu-item__meta-text{opacity:1}.sl-menu-item--focused .sl-menu-item__description{color:var(--color-on-primary)}.sl-menu-item--focused 
.sl-menu-item__icon{color:var(--color-on-primary)!important}.sl-menu-item--submenu-active{background-color:var(--color-primary-tint)}.sl-menu-item__title-wrapper{max-width:250px}.sl-menu-item__description{-webkit-line-clamp:2;-webkit-box-orient:vertical;display:-webkit-box;overflow:hidden}.sl-popover{--tw-blur:var(--tw-empty,/*!*/ /*!*/);--tw-brightness:var(--tw-empty,/*!*/ /*!*/);--tw-contrast:var(--tw-empty,/*!*/ /*!*/);--tw-grayscale:var(--tw-empty,/*!*/ /*!*/);--tw-hue-rotate:var(--tw-empty,/*!*/ /*!*/);--tw-invert:var(--tw-empty,/*!*/ /*!*/);--tw-saturate:var(--tw-empty,/*!*/ /*!*/);--tw-sepia:var(--tw-empty,/*!*/ /*!*/);--tw-drop-shadow:var(--tw-empty,/*!*/ /*!*/);--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2));border-radius:2px;filter:var(--tw-blur) var(--tw-brightness) var(--tw-contrast) var(--tw-grayscale) var(--tw-hue-rotate) var(--tw-invert) var(--tw-saturate) var(--tw-sepia) var(--tw-drop-shadow)}.sl-popover>:not(.sl-popover__tip){border-radius:2px;position:relative;z-index:10}.sl-prose{-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale;--fs-paragraph:1em;--fs-paragraph-small:0.875em;--fs-code:0.875em;font-family:var(--font-prose);font-size:16px;line-height:var(--lh-paragraph)}.sl-prose>:first-child{margin-top:0}.sl-prose>:last-child{margin-bottom:0}.sl-prose h1{font-size:2.25em}.sl-prose>h1{margin-bottom:1.11em;margin-top:0}.sl-prose h2{font-size:1.75em;line-height:1.3333333}.sl-prose>h2{margin-bottom:1em;margin-top:1.428em}.sl-prose h3{font-size:1.25em}.sl-prose>h3{margin-bottom:.8em;margin-top:2em}.sl-prose h4{font-size:1em}.sl-prose>h4{margin-bottom:.5em;margin-top:2em}.sl-prose h2+*,.sl-prose h3+*,.sl-prose h4+*{margin-top:0}.sl-prose strong{font-weight:600}.sl-prose .sl-text-lg{font-size:.875em}.sl-prose p{color:var(--color-text-paragraph);font-size:var(--fs-paragraph);margin-bottom:1em;margin-top:1em}.sl-prose p:first-child{margin-top:0}.sl-prose p:last-child{margin-bottom:0}.sl-prose p>a>img{display:inline}.sl-prose caption a,.sl-prose figcaption a,.sl-prose li a,.sl-prose p a,.sl-prose table a{color:var(--color-link)}.sl-prose caption a:hover,.sl-prose figcaption a:hover,.sl-prose li a:hover,.sl-prose p a:hover,.sl-prose table a:hover{color:var(--color-link-dark)}.sl-prose caption a,.sl-prose figcaption a,.sl-prose li a,.sl-prose p a,.sl-prose table a{--color-link:var(--color-text-primary);--color-link-dark:var(--color-primary-dark)}.sl-prose hr{margin-bottom:1em;margin-top:1em}.sl-prose .sl-live-code{margin:1.25em -4px;table-layout:auto;width:100%}.sl-prose .sl-live-code__inner>pre{margin-bottom:0;margin-top:0}.sl-prose .sl-callout,.sl-prose ol,.sl-prose ul{margin-bottom:1.5em;margin-top:1.5em}.sl-prose ol,.sl-prose ul{line-height:var(--lh-paragraph)}.sl-prose ol li,.sl-prose ul li{padding-left:2em}.sl-prose ol li{position:relative}.sl-prose ol li:before{content:counter(list-item) ". 
";font-variant-numeric:tabular-nums}.sl-prose ul:not(.contains-task-list) li{position:relative}.sl-prose ul:not(.contains-task-list) li:before{background-color:var(--color-text);border-radius:50%;content:"";height:.375em;left:.75em;opacity:.7;position:absolute;top:.625em;width:.375em}.sl-prose li{margin-bottom:4px;margin-top:4px;padding-left:1.75em}.sl-prose li p{display:inline;margin-bottom:.75em;margin-top:.75em}.sl-prose li>:first-child{margin-top:0}.sl-prose>ol p+:last-child,.sl-prose>ul p+:last-child{margin-bottom:.75em}.sl-prose ol ol,.sl-prose ol ul,.sl-prose ul ol,.sl-prose ul ul{margin-bottom:2px;margin-top:2px}.sl-prose ul.contains-task-list input{margin-left:-1.875em;margin-right:.625em;position:relative;top:1px}.sl-prose ul.contains-task-list p{margin-top:0}.sl-prose figure{margin-bottom:1.5em;margin-top:1.5em}.sl-prose figure figure,.sl-prose figure img,.sl-prose figure video{margin-bottom:0;margin-top:0}.sl-prose figure>figcaption,.sl-prose figure>figcaption p{color:var(--color-text-muted);font-size:var(--fs-paragraph-small);line-height:var(--lh-paragraph-small);margin-top:8px;padding-left:16px;padding-right:16px;text-align:center}.sl-prose blockquote p{margin-bottom:.5em;margin-top:.5em}.sl-prose table{font-size:var(--fs-paragraph-small);margin-bottom:1.5em;margin-left:-4px;margin-right:-4px;overflow-x:auto;table-layout:auto;width:100%}.sl-prose thead td,.sl-prose thead th{color:var(--color-text-muted);font-size:.857em;font-weight:500;padding:8px 12px;text-transform:uppercase}.sl-prose thead td:first-child,.sl-prose thead th:first-child{padding-left:4px}.sl-prose tbody{border-radius:5px;box-shadow:0 0 0 1px var(--color-border,currentColor)}.sl-prose tbody tr{border-top-width:1px}.sl-prose tbody tr:first-child{border-top:0}.sl-prose tbody tr:nth-child(2n){background-color:var(--color-canvas-tint)}.sl-prose td{margin:.625em .75em;padding:10px 12px;vertical-align:top}.sl-prose td:not([align=center],[align=right]),.sl-prose th:not([align=center],[align=right]){text-align:left}.sl-prose .mermaid{margin-bottom:1.5em;margin-top:1.5em}.sl-prose .mermaid>svg{border-radius:5px;border-width:1px;height:auto!important;padding:1.25em}.sl-prose .sl-code-group .mermaid,.sl-prose .sl-code-group pre{margin-top:0}.sl-svg-focus{filter:drop-shadow(0 0 1px hsla(var(--primary-h),80%,51%,.9))}.sl-radio-group__radio:hover{cursor:pointer}.sl-radio-group__radio--disabled{opacity:.6}.sl-radio-group__radio--disabled:hover{cursor:not-allowed}.sl-switch .sl-switch__indicator{transition:background-color .1s cubic-bezier(.4,1,.75,.9)}.sl-switch .sl-switch__indicator .sl-switch__icon{visibility:hidden}.sl-switch .sl-switch__indicator:before{background-color:var(--color-canvas);border-radius:50%;content:"";height:calc(100% - 4px);left:0;margin:2px;position:absolute;transition:left .1s cubic-bezier(.4,1,.75,.9);width:calc(50% - 4px)}.sl-switch input:checked:disabled~.sl-switch__indicator{background-color:var(--color-primary-light)}.sl-switch input:checked~.sl-switch__indicator{background-color:var(--color-primary)}.sl-switch input:checked~.sl-switch__indicator .sl-switch__icon{visibility:visible}.sl-switch input:checked~.sl-switch__indicator:before{left:50%}.sl-tooltip{--tw-blur:var(--tw-empty,/*!*/ /*!*/);--tw-brightness:var(--tw-empty,/*!*/ /*!*/);--tw-contrast:var(--tw-empty,/*!*/ /*!*/);--tw-grayscale:var(--tw-empty,/*!*/ /*!*/);--tw-hue-rotate:var(--tw-empty,/*!*/ /*!*/);--tw-invert:var(--tw-empty,/*!*/ /*!*/);--tw-saturate:var(--tw-empty,/*!*/ /*!*/);--tw-sepia:var(--tw-empty,/*!*/ 
/*!*/);--tw-drop-shadow:var(--tw-empty,/*!*/ /*!*/);--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2));border-radius:2px;filter:var(--tw-blur) var(--tw-brightness) var(--tw-contrast) var(--tw-grayscale) var(--tw-hue-rotate) var(--tw-invert) var(--tw-saturate) var(--tw-sepia) var(--tw-drop-shadow);font-size:11px;max-width:300px;padding:4px 6px}.sl-tooltip>:not(.sl-tooltip_tip){position:relative;z-index:10}input,textarea{background-color:transparent}.sl-focus-ring{--tw-ring-color:hsla(var(--primary-h),80%,61%,var(--tw-ring-opacity)) ;--tw-ring-opacity:0.5;--tw-ring-offset-shadow:var(--tw-ring-inset) 0 0 0 var(--tw-ring-offset-width) var(--tw-ring-offset-color);--tw-ring-shadow:var(--tw-ring-inset) 0 0 0 calc(3px + var(--tw-ring-offset-width)) var(--tw-ring-color);border-radius:2px;box-shadow:var(--tw-ring-offset-shadow),var(--tw-ring-shadow),var(--tw-shadow,0 0 #0000)}@media (max-width:479px){.sl-stack--horizontal.sm\:sl-stack--1>:not(style)~:not(style){margin-left:4px}.sl-stack--vertical.sm\:sl-stack--1>:not(style)~:not(style){margin-top:4px}.sl-stack--horizontal.sm\:sl-stack--2>:not(style)~:not(style){margin-left:8px}.sl-stack--vertical.sm\:sl-stack--2>:not(style)~:not(style){margin-top:8px}.sl-stack--horizontal.sm\:sl-stack--3>:not(style)~:not(style){margin-left:12px}.sl-stack--vertical.sm\:sl-stack--3>:not(style)~:not(style){margin-top:12px}.sl-stack--horizontal.sm\:sl-stack--4>:not(style)~:not(style){margin-left:16px}.sl-stack--vertical.sm\:sl-stack--4>:not(style)~:not(style){margin-top:16px}.sl-stack--horizontal.sm\:sl-stack--5>:not(style)~:not(style){margin-left:20px}.sl-stack--vertical.sm\:sl-stack--5>:not(style)~:not(style){margin-top:20px}.sl-stack--horizontal.sm\:sl-stack--6>:not(style)~:not(style){margin-left:24px}.sl-stack--vertical.sm\:sl-stack--6>:not(style)~:not(style){margin-top:24px}.sl-stack--horizontal.sm\:sl-stack--7>:not(style)~:not(style){margin-left:28px}.sl-stack--vertical.sm\:sl-stack--7>:not(style)~:not(style){margin-top:28px}.sl-stack--horizontal.sm\:sl-stack--8>:not(style)~:not(style){margin-left:32px}.sl-stack--vertical.sm\:sl-stack--8>:not(style)~:not(style){margin-top:32px}.sl-stack--horizontal.sm\:sl-stack--9>:not(style)~:not(style){margin-left:36px}.sl-stack--vertical.sm\:sl-stack--9>:not(style)~:not(style){margin-top:36px}.sl-stack--horizontal.sm\:sl-stack--10>:not(style)~:not(style){margin-left:40px}.sl-stack--vertical.sm\:sl-stack--10>:not(style)~:not(style){margin-top:40px}.sl-stack--horizontal.sm\:sl-stack--12>:not(style)~:not(style){margin-left:48px}.sl-stack--vertical.sm\:sl-stack--12>:not(style)~:not(style){margin-top:48px}.sl-stack--horizontal.sm\:sl-stack--14>:not(style)~:not(style){margin-left:56px}.sl-stack--vertical.sm\:sl-stack--14>:not(style)~:not(style){margin-top:56px}.sl-stack--horizontal.sm\:sl-stack--16>:not(style)~:not(style){margin-left:64px}.sl-stack--vertical.sm\:sl-stack--16>:not(style)~:not(style){margin-top:64px}.sl-stack--horizontal.sm\:sl-stack--20>:not(style)~:not(style){margin-left:80px}.sl-stack--vertical.sm\:sl-stack--20>:not(style)~:not(style){margin-top:80px}.sl-stack--horizontal.sm\:sl-stack--24>:not(style)~:not(style){margin-left:96px}.sl-stack--vertical.sm\:sl-stack--24>:not(style)~:not(style){margin-top:96px}.sl-stack--horizontal.sm\:sl-stack--32>:not(style)~:not(style){margin-left:128px}.sl-stack--vertical.sm\:sl-stack--32>:not(style)~:not(style){margin-top:128px}.sm\:sl-content-center{align-content:center}.sm\:sl-content-start{align-content:flex-start}.sm\:sl-content-end{
align-content:flex-end}.sm\:sl-content-between{align-content:space-between}.sm\:sl-content-around{align-content:space-around}.sm\:sl-content-evenly{align-content:space-evenly}.sm\:sl-items-start{align-items:flex-start}.sm\:sl-items-end{align-items:flex-end}.sm\:sl-items-center{align-items:center}.sm\:sl-items-baseline{align-items:baseline}.sm\:sl-items-stretch{align-items:stretch}.sm\:sl-self-auto{align-self:auto}.sm\:sl-self-start{align-self:flex-start}.sm\:sl-self-end{align-self:flex-end}.sm\:sl-self-center{align-self:center}.sm\:sl-self-stretch{align-self:stretch}.sm\:sl-blur-0,.sm\:sl-blur-none{--tw-blur:blur(0)}.sm\:sl-blur-sm{--tw-blur:blur(4px)}.sm\:sl-blur{--tw-blur:blur(8px)}.sm\:sl-blur-md{--tw-blur:blur(12px)}.sm\:sl-blur-lg{--tw-blur:blur(16px)}.sm\:sl-blur-xl{--tw-blur:blur(24px)}.sm\:sl-blur-2xl{--tw-blur:blur(40px)}.sm\:sl-blur-3xl{--tw-blur:blur(64px)}.sm\:sl-block{display:block}.sm\:sl-inline-block{display:inline-block}.sm\:sl-inline{display:inline}.sm\:sl-flex{display:flex}.sm\:sl-inline-flex{display:inline-flex}.sm\:sl-table{display:table}.sm\:sl-inline-table{display:inline-table}.sm\:sl-table-caption{display:table-caption}.sm\:sl-table-cell{display:table-cell}.sm\:sl-table-column{display:table-column}.sm\:sl-table-column-group{display:table-column-group}.sm\:sl-table-footer-group{display:table-footer-group}.sm\:sl-table-header-group{display:table-header-group}.sm\:sl-table-row-group{display:table-row-group}.sm\:sl-table-row{display:table-row}.sm\:sl-flow-root{display:flow-root}.sm\:sl-grid{display:grid}.sm\:sl-inline-grid{display:inline-grid}.sm\:sl-contents{display:contents}.sm\:sl-list-item{display:list-item}.sm\:sl-hidden{display:none}.sm\:sl-drop-shadow{--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2))}.sm\:sl-flex-1{flex:1 1}.sm\:sl-flex-auto{flex:1 1 auto}.sm\:sl-flex-initial{flex:0 1 
auto}.sm\:sl-flex-none{flex:none}.sm\:sl-flex-row{flex-direction:row}.sm\:sl-flex-row-reverse{flex-direction:row-reverse}.sm\:sl-flex-col{flex-direction:column}.sm\:sl-flex-col-reverse{flex-direction:column-reverse}.sm\:sl-flex-grow-0{flex-grow:0}.sm\:sl-flex-grow{flex-grow:1}.sm\:sl-flex-shrink-0{flex-shrink:0}.sm\:sl-flex-shrink{flex-shrink:1}.sm\:sl-flex-wrap{flex-wrap:wrap}.sm\:sl-flex-wrap-reverse{flex-wrap:wrap-reverse}.sm\:sl-flex-nowrap{flex-wrap:nowrap}.sm\:sl-h-0{height:0}.sm\:sl-h-1{height:4px}.sm\:sl-h-2{height:8px}.sm\:sl-h-3{height:12px}.sm\:sl-h-4{height:16px}.sm\:sl-h-5{height:20px}.sm\:sl-h-6{height:24px}.sm\:sl-h-7{height:28px}.sm\:sl-h-8{height:32px}.sm\:sl-h-9{height:36px}.sm\:sl-h-10{height:40px}.sm\:sl-h-11{height:44px}.sm\:sl-h-12{height:48px}.sm\:sl-h-14{height:56px}.sm\:sl-h-16{height:64px}.sm\:sl-h-20{height:80px}.sm\:sl-h-24{height:96px}.sm\:sl-h-28{height:112px}.sm\:sl-h-32{height:128px}.sm\:sl-h-36{height:144px}.sm\:sl-h-40{height:160px}.sm\:sl-h-44{height:176px}.sm\:sl-h-48{height:192px}.sm\:sl-h-52{height:208px}.sm\:sl-h-56{height:224px}.sm\:sl-h-60{height:240px}.sm\:sl-h-64{height:256px}.sm\:sl-h-72{height:288px}.sm\:sl-h-80{height:320px}.sm\:sl-h-96{height:384px}.sm\:sl-h-auto{height:auto}.sm\:sl-h-px{height:1px}.sm\:sl-h-0\.5{height:2px}.sm\:sl-h-1\.5{height:6px}.sm\:sl-h-2\.5{height:10px}.sm\:sl-h-3\.5{height:14px}.sm\:sl-h-4\.5{height:18px}.sm\:sl-h-xs{height:20px}.sm\:sl-h-sm{height:24px}.sm\:sl-h-md{height:32px}.sm\:sl-h-lg{height:36px}.sm\:sl-h-xl{height:44px}.sm\:sl-h-2xl{height:52px}.sm\:sl-h-3xl{height:60px}.sm\:sl-h-full{height:100%}.sm\:sl-h-screen{height:100vh}.sm\:sl-justify-start{justify-content:flex-start}.sm\:sl-justify-end{justify-content:flex-end}.sm\:sl-justify-center{justify-content:center}.sm\:sl-justify-between{justify-content:space-between}.sm\:sl-justify-around{justify-content:space-around}.sm\:sl-justify-evenly{justify-content:space-evenly}.sm\:sl-justify-items-start{justify-items:start}.sm\:sl-justify-items-end{justify-items:end}.sm\:sl-justify-items-center{justify-items:center}.sm\:sl-justify-items-stretch{justify-items:stretch}.sm\:sl-justify-self-auto{justify-self:auto}.sm\:sl-justify-self-start{justify-self:start}.sm\:sl-justify-self-end{justify-self:end}.sm\:sl-justify-self-center{justify-self:center}.sm\:sl-justify-self-stretch{justify-self:stretch}.sm\:sl-m-0{margin:0}.sm\:sl-m-1{margin:4px}.sm\:sl-m-2{margin:8px}.sm\:sl-m-3{margin:12px}.sm\:sl-m-4{margin:16px}.sm\:sl-m-5{margin:20px}.sm\:sl-m-6{margin:24px}.sm\:sl-m-7{margin:28px}.sm\:sl-m-8{margin:32px}.sm\:sl-m-9{margin:36px}.sm\:sl-m-10{margin:40px}.sm\:sl-m-11{margin:44px}.sm\:sl-m-12{margin:48px}.sm\:sl-m-14{margin:56px}.sm\:sl-m-16{margin:64px}.sm\:sl-m-20{margin:80px}.sm\:sl-m-24{margin:96px}.sm\:sl-m-28{margin:112px}.sm\:sl-m-32{margin:128px}.sm\:sl-m-36{margin:144px}.sm\:sl-m-40{margin:160px}.sm\:sl-m-44{margin:176px}.sm\:sl-m-48{margin:192px}.sm\:sl-m-52{margin:208px}.sm\:sl-m-56{margin:224px}.sm\:sl-m-60{margin:240px}.sm\:sl-m-64{margin:256px}.sm\:sl-m-72{margin:288px}.sm\:sl-m-80{margin:320px}.sm\:sl-m-96{margin:384px}.sm\:sl-m-auto{margin:auto}.sm\:sl-m-px{margin:1px}.sm\:sl-m-0\.5{margin:2px}.sm\:sl-m-1\.5{margin:6px}.sm\:sl-m-2\.5{margin:10px}.sm\:sl-m-3\.5{margin:14px}.sm\:sl-m-4\.5{margin:18px}.sm\:sl--m-0{margin:0}.sm\:sl--m-1{margin:-4px}.sm\:sl--m-2{margin:-8px}.sm\:sl--m-3{margin:-12px}.sm\:sl--m-4{margin:-16px}.sm\:sl--m-5{margin:-20px}.sm\:sl--m-6{margin:-24px}.sm\:sl--m-7{margin:-28px}.sm\:sl--m-8{margin:-32px}.sm\:sl--m-9{margin:-36px}.sm\:sl--m-10{m
argin:-40px}.sm\:sl--m-11{margin:-44px}.sm\:sl--m-12{margin:-48px}.sm\:sl--m-14{margin:-56px}.sm\:sl--m-16{margin:-64px}.sm\:sl--m-20{margin:-80px}.sm\:sl--m-24{margin:-96px}.sm\:sl--m-28{margin:-112px}.sm\:sl--m-32{margin:-128px}.sm\:sl--m-36{margin:-144px}.sm\:sl--m-40{margin:-160px}.sm\:sl--m-44{margin:-176px}.sm\:sl--m-48{margin:-192px}.sm\:sl--m-52{margin:-208px}.sm\:sl--m-56{margin:-224px}.sm\:sl--m-60{margin:-240px}.sm\:sl--m-64{margin:-256px}.sm\:sl--m-72{margin:-288px}.sm\:sl--m-80{margin:-320px}.sm\:sl--m-96{margin:-384px}.sm\:sl--m-px{margin:-1px}.sm\:sl--m-0\.5{margin:-2px}.sm\:sl--m-1\.5{margin:-6px}.sm\:sl--m-2\.5{margin:-10px}.sm\:sl--m-3\.5{margin:-14px}.sm\:sl--m-4\.5{margin:-18px}.sm\:sl-my-0{margin-bottom:0;margin-top:0}.sm\:sl-mx-0{margin-left:0;margin-right:0}.sm\:sl-my-1{margin-bottom:4px;margin-top:4px}.sm\:sl-mx-1{margin-left:4px;margin-right:4px}.sm\:sl-my-2{margin-bottom:8px;margin-top:8px}.sm\:sl-mx-2{margin-left:8px;margin-right:8px}.sm\:sl-my-3{margin-bottom:12px;margin-top:12px}.sm\:sl-mx-3{margin-left:12px;margin-right:12px}.sm\:sl-my-4{margin-bottom:16px;margin-top:16px}.sm\:sl-mx-4{margin-left:16px;margin-right:16px}.sm\:sl-my-5{margin-bottom:20px;margin-top:20px}.sm\:sl-mx-5{margin-left:20px;margin-right:20px}.sm\:sl-my-6{margin-bottom:24px;margin-top:24px}.sm\:sl-mx-6{margin-left:24px;margin-right:24px}.sm\:sl-my-7{margin-bottom:28px;margin-top:28px}.sm\:sl-mx-7{margin-left:28px;margin-right:28px}.sm\:sl-my-8{margin-bottom:32px;margin-top:32px}.sm\:sl-mx-8{margin-left:32px;margin-right:32px}.sm\:sl-my-9{margin-bottom:36px;margin-top:36px}.sm\:sl-mx-9{margin-left:36px;margin-right:36px}.sm\:sl-my-10{margin-bottom:40px;margin-top:40px}.sm\:sl-mx-10{margin-left:40px;margin-right:40px}.sm\:sl-my-11{margin-bottom:44px;margin-top:44px}.sm\:sl-mx-11{margin-left:44px;margin-right:44px}.sm\:sl-my-12{margin-bottom:48px;margin-top:48px}.sm\:sl-mx-12{margin-left:48px;margin-right:48px}.sm\:sl-my-14{margin-bottom:56px;margin-top:56px}.sm\:sl-mx-14{margin-left:56px;margin-right:56px}.sm\:sl-my-16{margin-bottom:64px;margin-top:64px}.sm\:sl-mx-16{margin-left:64px;margin-right:64px}.sm\:sl-my-20{margin-bottom:80px;margin-top:80px}.sm\:sl-mx-20{margin-left:80px;margin-right:80px}.sm\:sl-my-24{margin-bottom:96px;margin-top:96px}.sm\:sl-mx-24{margin-left:96px;margin-right:96px}.sm\:sl-my-28{margin-bottom:112px;margin-top:112px}.sm\:sl-mx-28{margin-left:112px;margin-right:112px}.sm\:sl-my-32{margin-bottom:128px;margin-top:128px}.sm\:sl-mx-32{margin-left:128px;margin-right:128px}.sm\:sl-my-36{margin-bottom:144px;margin-top:144px}.sm\:sl-mx-36{margin-left:144px;margin-right:144px}.sm\:sl-my-40{margin-bottom:160px;margin-top:160px}.sm\:sl-mx-40{margin-left:160px;margin-right:160px}.sm\:sl-my-44{margin-bottom:176px;margin-top:176px}.sm\:sl-mx-44{margin-left:176px;margin-right:176px}.sm\:sl-my-48{margin-bottom:192px;margin-top:192px}.sm\:sl-mx-48{margin-left:192px;margin-right:192px}.sm\:sl-my-52{margin-bottom:208px;margin-top:208px}.sm\:sl-mx-52{margin-left:208px;margin-right:208px}.sm\:sl-my-56{margin-bottom:224px;margin-top:224px}.sm\:sl-mx-56{margin-left:224px;margin-right:224px}.sm\:sl-my-60{margin-bottom:240px;margin-top:240px}.sm\:sl-mx-60{margin-left:240px;margin-right:240px}.sm\:sl-my-64{margin-bottom:256px;margin-top:256px}.sm\:sl-mx-64{margin-left:256px;margin-right:256px}.sm\:sl-my-72{margin-bottom:288px;margin-top:288px}.sm\:sl-mx-72{margin-left:288px;margin-right:288px}.sm\:sl-my-80{margin-bottom:320px;margin-top:320px}.sm\:sl-mx-80{margin-left:320px;margin-right:320p
x}.sm\:sl-my-96{margin-bottom:384px;margin-top:384px}.sm\:sl-mx-96{margin-left:384px;margin-right:384px}.sm\:sl-my-auto{margin-bottom:auto;margin-top:auto}.sm\:sl-mx-auto{margin-left:auto;margin-right:auto}.sm\:sl-my-px{margin-bottom:1px;margin-top:1px}.sm\:sl-mx-px{margin-left:1px;margin-right:1px}.sm\:sl-my-0\.5{margin-bottom:2px;margin-top:2px}.sm\:sl-mx-0\.5{margin-left:2px;margin-right:2px}.sm\:sl-my-1\.5{margin-bottom:6px;margin-top:6px}.sm\:sl-mx-1\.5{margin-left:6px;margin-right:6px}.sm\:sl-my-2\.5{margin-bottom:10px;margin-top:10px}.sm\:sl-mx-2\.5{margin-left:10px;margin-right:10px}.sm\:sl-my-3\.5{margin-bottom:14px;margin-top:14px}.sm\:sl-mx-3\.5{margin-left:14px;margin-right:14px}.sm\:sl-my-4\.5{margin-bottom:18px;margin-top:18px}.sm\:sl-mx-4\.5{margin-left:18px;margin-right:18px}.sm\:sl--my-0{margin-bottom:0;margin-top:0}.sm\:sl--mx-0{margin-left:0;margin-right:0}.sm\:sl--my-1{margin-bottom:-4px;margin-top:-4px}.sm\:sl--mx-1{margin-left:-4px;margin-right:-4px}.sm\:sl--my-2{margin-bottom:-8px;margin-top:-8px}.sm\:sl--mx-2{margin-left:-8px;margin-right:-8px}.sm\:sl--my-3{margin-bottom:-12px;margin-top:-12px}.sm\:sl--mx-3{margin-left:-12px;margin-right:-12px}.sm\:sl--my-4{margin-bottom:-16px;margin-top:-16px}.sm\:sl--mx-4{margin-left:-16px;margin-right:-16px}.sm\:sl--my-5{margin-bottom:-20px;margin-top:-20px}.sm\:sl--mx-5{margin-left:-20px;margin-right:-20px}.sm\:sl--my-6{margin-bottom:-24px;margin-top:-24px}.sm\:sl--mx-6{margin-left:-24px;margin-right:-24px}.sm\:sl--my-7{margin-bottom:-28px;margin-top:-28px}.sm\:sl--mx-7{margin-left:-28px;margin-right:-28px}.sm\:sl--my-8{margin-bottom:-32px;margin-top:-32px}.sm\:sl--mx-8{margin-left:-32px;margin-right:-32px}.sm\:sl--my-9{margin-bottom:-36px;margin-top:-36px}.sm\:sl--mx-9{margin-left:-36px;margin-right:-36px}.sm\:sl--my-10{margin-bottom:-40px;margin-top:-40px}.sm\:sl--mx-10{margin-left:-40px;margin-right:-40px}.sm\:sl--my-11{margin-bottom:-44px;margin-top:-44px}.sm\:sl--mx-11{margin-left:-44px;margin-right:-44px}.sm\:sl--my-12{margin-bottom:-48px;margin-top:-48px}.sm\:sl--mx-12{margin-left:-48px;margin-right:-48px}.sm\:sl--my-14{margin-bottom:-56px;margin-top:-56px}.sm\:sl--mx-14{margin-left:-56px;margin-right:-56px}.sm\:sl--my-16{margin-bottom:-64px;margin-top:-64px}.sm\:sl--mx-16{margin-left:-64px;margin-right:-64px}.sm\:sl--my-20{margin-bottom:-80px;margin-top:-80px}.sm\:sl--mx-20{margin-left:-80px;margin-right:-80px}.sm\:sl--my-24{margin-bottom:-96px;margin-top:-96px}.sm\:sl--mx-24{margin-left:-96px;margin-right:-96px}.sm\:sl--my-28{margin-bottom:-112px;margin-top:-112px}.sm\:sl--mx-28{margin-left:-112px;margin-right:-112px}.sm\:sl--my-32{margin-bottom:-128px;margin-top:-128px}.sm\:sl--mx-32{margin-left:-128px;margin-right:-128px}.sm\:sl--my-36{margin-bottom:-144px;margin-top:-144px}.sm\:sl--mx-36{margin-left:-144px;margin-right:-144px}.sm\:sl--my-40{margin-bottom:-160px;margin-top:-160px}.sm\:sl--mx-40{margin-left:-160px;margin-right:-160px}.sm\:sl--my-44{margin-bottom:-176px;margin-top:-176px}.sm\:sl--mx-44{margin-left:-176px;margin-right:-176px}.sm\:sl--my-48{margin-bottom:-192px;margin-top:-192px}.sm\:sl--mx-48{margin-left:-192px;margin-right:-192px}.sm\:sl--my-52{margin-bottom:-208px;margin-top:-208px}.sm\:sl--mx-52{margin-left:-208px;margin-right:-208px}.sm\:sl--my-56{margin-bottom:-224px;margin-top:-224px}.sm\:sl--mx-56{margin-left:-224px;margin-right:-224px}.sm\:sl--my-60{margin-bottom:-240px;margin-top:-240px}.sm\:sl--mx-60{margin-left:-240px;margin-right:-240px}.sm\:sl--my-64{margin-bottom:-256px;margin-top:-256px}.sm
\:sl--mx-64{margin-left:-256px;margin-right:-256px}.sm\:sl--my-72{margin-bottom:-288px;margin-top:-288px}.sm\:sl--mx-72{margin-left:-288px;margin-right:-288px}.sm\:sl--my-80{margin-bottom:-320px;margin-top:-320px}.sm\:sl--mx-80{margin-left:-320px;margin-right:-320px}.sm\:sl--my-96{margin-bottom:-384px;margin-top:-384px}.sm\:sl--mx-96{margin-left:-384px;margin-right:-384px}.sm\:sl--my-px{margin-bottom:-1px;margin-top:-1px}.sm\:sl--mx-px{margin-left:-1px;margin-right:-1px}.sm\:sl--my-0\.5{margin-bottom:-2px;margin-top:-2px}.sm\:sl--mx-0\.5{margin-left:-2px;margin-right:-2px}.sm\:sl--my-1\.5{margin-bottom:-6px;margin-top:-6px}.sm\:sl--mx-1\.5{margin-left:-6px;margin-right:-6px}.sm\:sl--my-2\.5{margin-bottom:-10px;margin-top:-10px}.sm\:sl--mx-2\.5{margin-left:-10px;margin-right:-10px}.sm\:sl--my-3\.5{margin-bottom:-14px;margin-top:-14px}.sm\:sl--mx-3\.5{margin-left:-14px;margin-right:-14px}.sm\:sl--my-4\.5{margin-bottom:-18px;margin-top:-18px}.sm\:sl--mx-4\.5{margin-left:-18px;margin-right:-18px}.sm\:sl-mt-0{margin-top:0}.sm\:sl-mr-0{margin-right:0}.sm\:sl-mb-0{margin-bottom:0}.sm\:sl-ml-0{margin-left:0}.sm\:sl-mt-1{margin-top:4px}.sm\:sl-mr-1{margin-right:4px}.sm\:sl-mb-1{margin-bottom:4px}.sm\:sl-ml-1{margin-left:4px}.sm\:sl-mt-2{margin-top:8px}.sm\:sl-mr-2{margin-right:8px}.sm\:sl-mb-2{margin-bottom:8px}.sm\:sl-ml-2{margin-left:8px}.sm\:sl-mt-3{margin-top:12px}.sm\:sl-mr-3{margin-right:12px}.sm\:sl-mb-3{margin-bottom:12px}.sm\:sl-ml-3{margin-left:12px}.sm\:sl-mt-4{margin-top:16px}.sm\:sl-mr-4{margin-right:16px}.sm\:sl-mb-4{margin-bottom:16px}.sm\:sl-ml-4{margin-left:16px}.sm\:sl-mt-5{margin-top:20px}.sm\:sl-mr-5{margin-right:20px}.sm\:sl-mb-5{margin-bottom:20px}.sm\:sl-ml-5{margin-left:20px}.sm\:sl-mt-6{margin-top:24px}.sm\:sl-mr-6{margin-right:24px}.sm\:sl-mb-6{margin-bottom:24px}.sm\:sl-ml-6{margin-left:24px}.sm\:sl-mt-7{margin-top:28px}.sm\:sl-mr-7{margin-right:28px}.sm\:sl-mb-7{margin-bottom:28px}.sm\:sl-ml-7{margin-left:28px}.sm\:sl-mt-8{margin-top:32px}.sm\:sl-mr-8{margin-right:32px}.sm\:sl-mb-8{margin-bottom:32px}.sm\:sl-ml-8{margin-left:32px}.sm\:sl-mt-9{margin-top:36px}.sm\:sl-mr-9{margin-right:36px}.sm\:sl-mb-9{margin-bottom:36px}.sm\:sl-ml-9{margin-left:36px}.sm\:sl-mt-10{margin-top:40px}.sm\:sl-mr-10{margin-right:40px}.sm\:sl-mb-10{margin-bottom:40px}.sm\:sl-ml-10{margin-left:40px}.sm\:sl-mt-11{margin-top:44px}.sm\:sl-mr-11{margin-right:44px}.sm\:sl-mb-11{margin-bottom:44px}.sm\:sl-ml-11{margin-left:44px}.sm\:sl-mt-12{margin-top:48px}.sm\:sl-mr-12{margin-right:48px}.sm\:sl-mb-12{margin-bottom:48px}.sm\:sl-ml-12{margin-left:48px}.sm\:sl-mt-14{margin-top:56px}.sm\:sl-mr-14{margin-right:56px}.sm\:sl-mb-14{margin-bottom:56px}.sm\:sl-ml-14{margin-left:56px}.sm\:sl-mt-16{margin-top:64px}.sm\:sl-mr-16{margin-right:64px}.sm\:sl-mb-16{margin-bottom:64px}.sm\:sl-ml-16{margin-left:64px}.sm\:sl-mt-20{margin-top:80px}.sm\:sl-mr-20{margin-right:80px}.sm\:sl-mb-20{margin-bottom:80px}.sm\:sl-ml-20{margin-left:80px}.sm\:sl-mt-24{margin-top:96px}.sm\:sl-mr-24{margin-right:96px}.sm\:sl-mb-24{margin-bottom:96px}.sm\:sl-ml-24{margin-left:96px}.sm\:sl-mt-28{margin-top:112px}.sm\:sl-mr-28{margin-right:112px}.sm\:sl-mb-28{margin-bottom:112px}.sm\:sl-ml-28{margin-left:112px}.sm\:sl-mt-32{margin-top:128px}.sm\:sl-mr-32{margin-right:128px}.sm\:sl-mb-32{margin-bottom:128px}.sm\:sl-ml-32{margin-left:128px}.sm\:sl-mt-36{margin-top:144px}.sm\:sl-mr-36{margin-right:144px}.sm\:sl-mb-36{margin-bottom:144px}.sm\:sl-ml-36{margin-left:144px}.sm\:sl-mt-40{margin-top:160px}.sm\:sl-mr-40{margin-right:160px}.sm\:sl-mb-4
0{margin-bottom:160px}.sm\:sl-ml-40{margin-left:160px}.sm\:sl-mt-44{margin-top:176px}.sm\:sl-mr-44{margin-right:176px}.sm\:sl-mb-44{margin-bottom:176px}.sm\:sl-ml-44{margin-left:176px}.sm\:sl-mt-48{margin-top:192px}.sm\:sl-mr-48{margin-right:192px}.sm\:sl-mb-48{margin-bottom:192px}.sm\:sl-ml-48{margin-left:192px}.sm\:sl-mt-52{margin-top:208px}.sm\:sl-mr-52{margin-right:208px}.sm\:sl-mb-52{margin-bottom:208px}.sm\:sl-ml-52{margin-left:208px}.sm\:sl-mt-56{margin-top:224px}.sm\:sl-mr-56{margin-right:224px}.sm\:sl-mb-56{margin-bottom:224px}.sm\:sl-ml-56{margin-left:224px}.sm\:sl-mt-60{margin-top:240px}.sm\:sl-mr-60{margin-right:240px}.sm\:sl-mb-60{margin-bottom:240px}.sm\:sl-ml-60{margin-left:240px}.sm\:sl-mt-64{margin-top:256px}.sm\:sl-mr-64{margin-right:256px}.sm\:sl-mb-64{margin-bottom:256px}.sm\:sl-ml-64{margin-left:256px}.sm\:sl-mt-72{margin-top:288px}.sm\:sl-mr-72{margin-right:288px}.sm\:sl-mb-72{margin-bottom:288px}.sm\:sl-ml-72{margin-left:288px}.sm\:sl-mt-80{margin-top:320px}.sm\:sl-mr-80{margin-right:320px}.sm\:sl-mb-80{margin-bottom:320px}.sm\:sl-ml-80{margin-left:320px}.sm\:sl-mt-96{margin-top:384px}.sm\:sl-mr-96{margin-right:384px}.sm\:sl-mb-96{margin-bottom:384px}.sm\:sl-ml-96{margin-left:384px}.sm\:sl-mt-auto{margin-top:auto}.sm\:sl-mr-auto{margin-right:auto}.sm\:sl-mb-auto{margin-bottom:auto}.sm\:sl-ml-auto{margin-left:auto}.sm\:sl-mt-px{margin-top:1px}.sm\:sl-mr-px{margin-right:1px}.sm\:sl-mb-px{margin-bottom:1px}.sm\:sl-ml-px{margin-left:1px}.sm\:sl-mt-0\.5{margin-top:2px}.sm\:sl-mr-0\.5{margin-right:2px}.sm\:sl-mb-0\.5{margin-bottom:2px}.sm\:sl-ml-0\.5{margin-left:2px}.sm\:sl-mt-1\.5{margin-top:6px}.sm\:sl-mr-1\.5{margin-right:6px}.sm\:sl-mb-1\.5{margin-bottom:6px}.sm\:sl-ml-1\.5{margin-left:6px}.sm\:sl-mt-2\.5{margin-top:10px}.sm\:sl-mr-2\.5{margin-right:10px}.sm\:sl-mb-2\.5{margin-bottom:10px}.sm\:sl-ml-2\.5{margin-left:10px}.sm\:sl-mt-3\.5{margin-top:14px}.sm\:sl-mr-3\.5{margin-right:14px}.sm\:sl-mb-3\.5{margin-bottom:14px}.sm\:sl-ml-3\.5{margin-left:14px}.sm\:sl-mt-4\.5{margin-top:18px}.sm\:sl-mr-4\.5{margin-right:18px}.sm\:sl-mb-4\.5{margin-bottom:18px}.sm\:sl-ml-4\.5{margin-left:18px}.sm\:sl--mt-0{margin-top:0}.sm\:sl--mr-0{margin-right:0}.sm\:sl--mb-0{margin-bottom:0}.sm\:sl--ml-0{margin-left:0}.sm\:sl--mt-1{margin-top:-4px}.sm\:sl--mr-1{margin-right:-4px}.sm\:sl--mb-1{margin-bottom:-4px}.sm\:sl--ml-1{margin-left:-4px}.sm\:sl--mt-2{margin-top:-8px}.sm\:sl--mr-2{margin-right:-8px}.sm\:sl--mb-2{margin-bottom:-8px}.sm\:sl--ml-2{margin-left:-8px}.sm\:sl--mt-3{margin-top:-12px}.sm\:sl--mr-3{margin-right:-12px}.sm\:sl--mb-3{margin-bottom:-12px}.sm\:sl--ml-3{margin-left:-12px}.sm\:sl--mt-4{margin-top:-16px}.sm\:sl--mr-4{margin-right:-16px}.sm\:sl--mb-4{margin-bottom:-16px}.sm\:sl--ml-4{margin-left:-16px}.sm\:sl--mt-5{margin-top:-20px}.sm\:sl--mr-5{margin-right:-20px}.sm\:sl--mb-5{margin-bottom:-20px}.sm\:sl--ml-5{margin-left:-20px}.sm\:sl--mt-6{margin-top:-24px}.sm\:sl--mr-6{margin-right:-24px}.sm\:sl--mb-6{margin-bottom:-24px}.sm\:sl--ml-6{margin-left:-24px}.sm\:sl--mt-7{margin-top:-28px}.sm\:sl--mr-7{margin-right:-28px}.sm\:sl--mb-7{margin-bottom:-28px}.sm\:sl--ml-7{margin-left:-28px}.sm\:sl--mt-8{margin-top:-32px}.sm\:sl--mr-8{margin-right:-32px}.sm\:sl--mb-8{margin-bottom:-32px}.sm\:sl--ml-8{margin-left:-32px}.sm\:sl--mt-9{margin-top:-36px}.sm\:sl--mr-9{margin-right:-36px}.sm\:sl--mb-9{margin-bottom:-36px}.sm\:sl--ml-9{margin-left:-36px}.sm\:sl--mt-10{margin-top:-40px}.sm\:sl--mr-10{margin-right:-40px}.sm\:sl--mb-10{margin-bottom:-40px}.sm\:sl--ml-10{margin-left:-40px}.sm
\:sl--mt-11{margin-top:-44px}.sm\:sl--mr-11{margin-right:-44px}.sm\:sl--mb-11{margin-bottom:-44px}.sm\:sl--ml-11{margin-left:-44px}.sm\:sl--mt-12{margin-top:-48px}.sm\:sl--mr-12{margin-right:-48px}.sm\:sl--mb-12{margin-bottom:-48px}.sm\:sl--ml-12{margin-left:-48px}.sm\:sl--mt-14{margin-top:-56px}.sm\:sl--mr-14{margin-right:-56px}.sm\:sl--mb-14{margin-bottom:-56px}.sm\:sl--ml-14{margin-left:-56px}.sm\:sl--mt-16{margin-top:-64px}.sm\:sl--mr-16{margin-right:-64px}.sm\:sl--mb-16{margin-bottom:-64px}.sm\:sl--ml-16{margin-left:-64px}.sm\:sl--mt-20{margin-top:-80px}.sm\:sl--mr-20{margin-right:-80px}.sm\:sl--mb-20{margin-bottom:-80px}.sm\:sl--ml-20{margin-left:-80px}.sm\:sl--mt-24{margin-top:-96px}.sm\:sl--mr-24{margin-right:-96px}.sm\:sl--mb-24{margin-bottom:-96px}.sm\:sl--ml-24{margin-left:-96px}.sm\:sl--mt-28{margin-top:-112px}.sm\:sl--mr-28{margin-right:-112px}.sm\:sl--mb-28{margin-bottom:-112px}.sm\:sl--ml-28{margin-left:-112px}.sm\:sl--mt-32{margin-top:-128px}.sm\:sl--mr-32{margin-right:-128px}.sm\:sl--mb-32{margin-bottom:-128px}.sm\:sl--ml-32{margin-left:-128px}.sm\:sl--mt-36{margin-top:-144px}.sm\:sl--mr-36{margin-right:-144px}.sm\:sl--mb-36{margin-bottom:-144px}.sm\:sl--ml-36{margin-left:-144px}.sm\:sl--mt-40{margin-top:-160px}.sm\:sl--mr-40{margin-right:-160px}.sm\:sl--mb-40{margin-bottom:-160px}.sm\:sl--ml-40{margin-left:-160px}.sm\:sl--mt-44{margin-top:-176px}.sm\:sl--mr-44{margin-right:-176px}.sm\:sl--mb-44{margin-bottom:-176px}.sm\:sl--ml-44{margin-left:-176px}.sm\:sl--mt-48{margin-top:-192px}.sm\:sl--mr-48{margin-right:-192px}.sm\:sl--mb-48{margin-bottom:-192px}.sm\:sl--ml-48{margin-left:-192px}.sm\:sl--mt-52{margin-top:-208px}.sm\:sl--mr-52{margin-right:-208px}.sm\:sl--mb-52{margin-bottom:-208px}.sm\:sl--ml-52{margin-left:-208px}.sm\:sl--mt-56{margin-top:-224px}.sm\:sl--mr-56{margin-right:-224px}.sm\:sl--mb-56{margin-bottom:-224px}.sm\:sl--ml-56{margin-left:-224px}.sm\:sl--mt-60{margin-top:-240px}.sm\:sl--mr-60{margin-right:-240px}.sm\:sl--mb-60{margin-bottom:-240px}.sm\:sl--ml-60{margin-left:-240px}.sm\:sl--mt-64{margin-top:-256px}.sm\:sl--mr-64{margin-right:-256px}.sm\:sl--mb-64{margin-bottom:-256px}.sm\:sl--ml-64{margin-left:-256px}.sm\:sl--mt-72{margin-top:-288px}.sm\:sl--mr-72{margin-right:-288px}.sm\:sl--mb-72{margin-bottom:-288px}.sm\:sl--ml-72{margin-left:-288px}.sm\:sl--mt-80{margin-top:-320px}.sm\:sl--mr-80{margin-right:-320px}.sm\:sl--mb-80{margin-bottom:-320px}.sm\:sl--ml-80{margin-left:-320px}.sm\:sl--mt-96{margin-top:-384px}.sm\:sl--mr-96{margin-right:-384px}.sm\:sl--mb-96{margin-bottom:-384px}.sm\:sl--ml-96{margin-left:-384px}.sm\:sl--mt-px{margin-top:-1px}.sm\:sl--mr-px{margin-right:-1px}.sm\:sl--mb-px{margin-bottom:-1px}.sm\:sl--ml-px{margin-left:-1px}.sm\:sl--mt-0\.5{margin-top:-2px}.sm\:sl--mr-0\.5{margin-right:-2px}.sm\:sl--mb-0\.5{margin-bottom:-2px}.sm\:sl--ml-0\.5{margin-left:-2px}.sm\:sl--mt-1\.5{margin-top:-6px}.sm\:sl--mr-1\.5{margin-right:-6px}.sm\:sl--mb-1\.5{margin-bottom:-6px}.sm\:sl--ml-1\.5{margin-left:-6px}.sm\:sl--mt-2\.5{margin-top:-10px}.sm\:sl--mr-2\.5{margin-right:-10px}.sm\:sl--mb-2\.5{margin-bottom:-10px}.sm\:sl--ml-2\.5{margin-left:-10px}.sm\:sl--mt-3\.5{margin-top:-14px}.sm\:sl--mr-3\.5{margin-right:-14px}.sm\:sl--mb-3\.5{margin-bottom:-14px}.sm\:sl--ml-3\.5{margin-left:-14px}.sm\:sl--mt-4\.5{margin-top:-18px}.sm\:sl--mr-4\.5{margin-right:-18px}.sm\:sl--mb-4\.5{margin-bottom:-18px}.sm\:sl--ml-4\.5{margin-left:-18px}.sm\:sl-max-h-full{max-height:100%}.sm\:sl-max-h-screen{max-height:100vh}.sm\:sl-max-w-none{max-width:none}.sm\:sl-max-w-full{ma
x-width:100%}.sm\:sl-max-w-min{max-width:-moz-min-content;max-width:min-content}.sm\:sl-max-w-max{max-width:-moz-max-content;max-width:max-content}.sm\:sl-max-w-prose{max-width:65ch}.sm\:sl-min-h-full{min-height:100%}.sm\:sl-min-h-screen{min-height:100vh}.sm\:sl-min-w-full{min-width:100%}.sm\:sl-min-w-min{min-width:-moz-min-content;min-width:min-content}.sm\:sl-min-w-max{min-width:-moz-max-content;min-width:max-content}.sm\:sl-p-0{padding:0}.sm\:sl-p-1{padding:4px}.sm\:sl-p-2{padding:8px}.sm\:sl-p-3{padding:12px}.sm\:sl-p-4{padding:16px}.sm\:sl-p-5{padding:20px}.sm\:sl-p-6{padding:24px}.sm\:sl-p-7{padding:28px}.sm\:sl-p-8{padding:32px}.sm\:sl-p-9{padding:36px}.sm\:sl-p-10{padding:40px}.sm\:sl-p-11{padding:44px}.sm\:sl-p-12{padding:48px}.sm\:sl-p-14{padding:56px}.sm\:sl-p-16{padding:64px}.sm\:sl-p-20{padding:80px}.sm\:sl-p-24{padding:96px}.sm\:sl-p-28{padding:112px}.sm\:sl-p-32{padding:128px}.sm\:sl-p-36{padding:144px}.sm\:sl-p-40{padding:160px}.sm\:sl-p-44{padding:176px}.sm\:sl-p-48{padding:192px}.sm\:sl-p-52{padding:208px}.sm\:sl-p-56{padding:224px}.sm\:sl-p-60{padding:240px}.sm\:sl-p-64{padding:256px}.sm\:sl-p-72{padding:288px}.sm\:sl-p-80{padding:320px}.sm\:sl-p-96{padding:384px}.sm\:sl-p-px{padding:1px}.sm\:sl-p-0\.5{padding:2px}.sm\:sl-p-1\.5{padding:6px}.sm\:sl-p-2\.5{padding:10px}.sm\:sl-p-3\.5{padding:14px}.sm\:sl-p-4\.5{padding:18px}.sm\:sl-py-0{padding-bottom:0;padding-top:0}.sm\:sl-px-0{padding-left:0;padding-right:0}.sm\:sl-py-1{padding-bottom:4px;padding-top:4px}.sm\:sl-px-1{padding-left:4px;padding-right:4px}.sm\:sl-py-2{padding-bottom:8px;padding-top:8px}.sm\:sl-px-2{padding-left:8px;padding-right:8px}.sm\:sl-py-3{padding-bottom:12px;padding-top:12px}.sm\:sl-px-3{padding-left:12px;padding-right:12px}.sm\:sl-py-4{padding-bottom:16px;padding-top:16px}.sm\:sl-px-4{padding-left:16px;padding-right:16px}.sm\:sl-py-5{padding-bottom:20px;padding-top:20px}.sm\:sl-px-5{padding-left:20px;padding-right:20px}.sm\:sl-py-6{padding-bottom:24px;padding-top:24px}.sm\:sl-px-6{padding-left:24px;padding-right:24px}.sm\:sl-py-7{padding-bottom:28px;padding-top:28px}.sm\:sl-px-7{padding-left:28px;padding-right:28px}.sm\:sl-py-8{padding-bottom:32px;padding-top:32px}.sm\:sl-px-8{padding-left:32px;padding-right:32px}.sm\:sl-py-9{padding-bottom:36px;padding-top:36px}.sm\:sl-px-9{padding-left:36px;padding-right:36px}.sm\:sl-py-10{padding-bottom:40px;padding-top:40px}.sm\:sl-px-10{padding-left:40px;padding-right:40px}.sm\:sl-py-11{padding-bottom:44px;padding-top:44px}.sm\:sl-px-11{padding-left:44px;padding-right:44px}.sm\:sl-py-12{padding-bottom:48px;padding-top:48px}.sm\:sl-px-12{padding-left:48px;padding-right:48px}.sm\:sl-py-14{padding-bottom:56px;padding-top:56px}.sm\:sl-px-14{padding-left:56px;padding-right:56px}.sm\:sl-py-16{padding-bottom:64px;padding-top:64px}.sm\:sl-px-16{padding-left:64px;padding-right:64px}.sm\:sl-py-20{padding-bottom:80px;padding-top:80px}.sm\:sl-px-20{padding-left:80px;padding-right:80px}.sm\:sl-py-24{padding-bottom:96px;padding-top:96px}.sm\:sl-px-24{padding-left:96px;padding-right:96px}.sm\:sl-py-28{padding-bottom:112px;padding-top:112px}.sm\:sl-px-28{padding-left:112px;padding-right:112px}.sm\:sl-py-32{padding-bottom:128px;padding-top:128px}.sm\:sl-px-32{padding-left:128px;padding-right:128px}.sm\:sl-py-36{padding-bottom:144px;padding-top:144px}.sm\:sl-px-36{padding-left:144px;padding-right:144px}.sm\:sl-py-40{padding-bottom:160px;padding-top:160px}.sm\:sl-px-40{padding-left:160px;padding-right:160px}.sm\:sl-py-44{padding-bottom:176px;padding-top:176px}.sm\:sl-px-44{paddin
g-left:176px;padding-right:176px}.sm\:sl-py-48{padding-bottom:192px;padding-top:192px}.sm\:sl-px-48{padding-left:192px;padding-right:192px}.sm\:sl-py-52{padding-bottom:208px;padding-top:208px}.sm\:sl-px-52{padding-left:208px;padding-right:208px}.sm\:sl-py-56{padding-bottom:224px;padding-top:224px}.sm\:sl-px-56{padding-left:224px;padding-right:224px}.sm\:sl-py-60{padding-bottom:240px;padding-top:240px}.sm\:sl-px-60{padding-left:240px;padding-right:240px}.sm\:sl-py-64{padding-bottom:256px;padding-top:256px}.sm\:sl-px-64{padding-left:256px;padding-right:256px}.sm\:sl-py-72{padding-bottom:288px;padding-top:288px}.sm\:sl-px-72{padding-left:288px;padding-right:288px}.sm\:sl-py-80{padding-bottom:320px;padding-top:320px}.sm\:sl-px-80{padding-left:320px;padding-right:320px}.sm\:sl-py-96{padding-bottom:384px;padding-top:384px}.sm\:sl-px-96{padding-left:384px;padding-right:384px}.sm\:sl-py-px{padding-bottom:1px;padding-top:1px}.sm\:sl-px-px{padding-left:1px;padding-right:1px}.sm\:sl-py-0\.5{padding-bottom:2px;padding-top:2px}.sm\:sl-px-0\.5{padding-left:2px;padding-right:2px}.sm\:sl-py-1\.5{padding-bottom:6px;padding-top:6px}.sm\:sl-px-1\.5{padding-left:6px;padding-right:6px}.sm\:sl-py-2\.5{padding-bottom:10px;padding-top:10px}.sm\:sl-px-2\.5{padding-left:10px;padding-right:10px}.sm\:sl-py-3\.5{padding-bottom:14px;padding-top:14px}.sm\:sl-px-3\.5{padding-left:14px;padding-right:14px}.sm\:sl-py-4\.5{padding-bottom:18px;padding-top:18px}.sm\:sl-px-4\.5{padding-left:18px;padding-right:18px}.sm\:sl-pt-0{padding-top:0}.sm\:sl-pr-0{padding-right:0}.sm\:sl-pb-0{padding-bottom:0}.sm\:sl-pl-0{padding-left:0}.sm\:sl-pt-1{padding-top:4px}.sm\:sl-pr-1{padding-right:4px}.sm\:sl-pb-1{padding-bottom:4px}.sm\:sl-pl-1{padding-left:4px}.sm\:sl-pt-2{padding-top:8px}.sm\:sl-pr-2{padding-right:8px}.sm\:sl-pb-2{padding-bottom:8px}.sm\:sl-pl-2{padding-left:8px}.sm\:sl-pt-3{padding-top:12px}.sm\:sl-pr-3{padding-right:12px}.sm\:sl-pb-3{padding-bottom:12px}.sm\:sl-pl-3{padding-left:12px}.sm\:sl-pt-4{padding-top:16px}.sm\:sl-pr-4{padding-right:16px}.sm\:sl-pb-4{padding-bottom:16px}.sm\:sl-pl-4{padding-left:16px}.sm\:sl-pt-5{padding-top:20px}.sm\:sl-pr-5{padding-right:20px}.sm\:sl-pb-5{padding-bottom:20px}.sm\:sl-pl-5{padding-left:20px}.sm\:sl-pt-6{padding-top:24px}.sm\:sl-pr-6{padding-right:24px}.sm\:sl-pb-6{padding-bottom:24px}.sm\:sl-pl-6{padding-left:24px}.sm\:sl-pt-7{padding-top:28px}.sm\:sl-pr-7{padding-right:28px}.sm\:sl-pb-7{padding-bottom:28px}.sm\:sl-pl-7{padding-left:28px}.sm\:sl-pt-8{padding-top:32px}.sm\:sl-pr-8{padding-right:32px}.sm\:sl-pb-8{padding-bottom:32px}.sm\:sl-pl-8{padding-left:32px}.sm\:sl-pt-9{padding-top:36px}.sm\:sl-pr-9{padding-right:36px}.sm\:sl-pb-9{padding-bottom:36px}.sm\:sl-pl-9{padding-left:36px}.sm\:sl-pt-10{padding-top:40px}.sm\:sl-pr-10{padding-right:40px}.sm\:sl-pb-10{padding-bottom:40px}.sm\:sl-pl-10{padding-left:40px}.sm\:sl-pt-11{padding-top:44px}.sm\:sl-pr-11{padding-right:44px}.sm\:sl-pb-11{padding-bottom:44px}.sm\:sl-pl-11{padding-left:44px}.sm\:sl-pt-12{padding-top:48px}.sm\:sl-pr-12{padding-right:48px}.sm\:sl-pb-12{padding-bottom:48px}.sm\:sl-pl-12{padding-left:48px}.sm\:sl-pt-14{padding-top:56px}.sm\:sl-pr-14{padding-right:56px}.sm\:sl-pb-14{padding-bottom:56px}.sm\:sl-pl-14{padding-left:56px}.sm\:sl-pt-16{padding-top:64px}.sm\:sl-pr-16{padding-right:64px}.sm\:sl-pb-16{padding-bottom:64px}.sm\:sl-pl-16{padding-left:64px}.sm\:sl-pt-20{padding-top:80px}.sm\:sl-pr-20{padding-right:80px}.sm\:sl-pb-20{padding-bottom:80px}.sm\:sl-pl-20{padding-left:80px}.sm\:sl-pt-24{padding-top:96px}.sm
\:sl-pr-24{padding-right:96px}.sm\:sl-pb-24{padding-bottom:96px}.sm\:sl-pl-24{padding-left:96px}.sm\:sl-pt-28{padding-top:112px}.sm\:sl-pr-28{padding-right:112px}.sm\:sl-pb-28{padding-bottom:112px}.sm\:sl-pl-28{padding-left:112px}.sm\:sl-pt-32{padding-top:128px}.sm\:sl-pr-32{padding-right:128px}.sm\:sl-pb-32{padding-bottom:128px}.sm\:sl-pl-32{padding-left:128px}.sm\:sl-pt-36{padding-top:144px}.sm\:sl-pr-36{padding-right:144px}.sm\:sl-pb-36{padding-bottom:144px}.sm\:sl-pl-36{padding-left:144px}.sm\:sl-pt-40{padding-top:160px}.sm\:sl-pr-40{padding-right:160px}.sm\:sl-pb-40{padding-bottom:160px}.sm\:sl-pl-40{padding-left:160px}.sm\:sl-pt-44{padding-top:176px}.sm\:sl-pr-44{padding-right:176px}.sm\:sl-pb-44{padding-bottom:176px}.sm\:sl-pl-44{padding-left:176px}.sm\:sl-pt-48{padding-top:192px}.sm\:sl-pr-48{padding-right:192px}.sm\:sl-pb-48{padding-bottom:192px}.sm\:sl-pl-48{padding-left:192px}.sm\:sl-pt-52{padding-top:208px}.sm\:sl-pr-52{padding-right:208px}.sm\:sl-pb-52{padding-bottom:208px}.sm\:sl-pl-52{padding-left:208px}.sm\:sl-pt-56{padding-top:224px}.sm\:sl-pr-56{padding-right:224px}.sm\:sl-pb-56{padding-bottom:224px}.sm\:sl-pl-56{padding-left:224px}.sm\:sl-pt-60{padding-top:240px}.sm\:sl-pr-60{padding-right:240px}.sm\:sl-pb-60{padding-bottom:240px}.sm\:sl-pl-60{padding-left:240px}.sm\:sl-pt-64{padding-top:256px}.sm\:sl-pr-64{padding-right:256px}.sm\:sl-pb-64{padding-bottom:256px}.sm\:sl-pl-64{padding-left:256px}.sm\:sl-pt-72{padding-top:288px}.sm\:sl-pr-72{padding-right:288px}.sm\:sl-pb-72{padding-bottom:288px}.sm\:sl-pl-72{padding-left:288px}.sm\:sl-pt-80{padding-top:320px}.sm\:sl-pr-80{padding-right:320px}.sm\:sl-pb-80{padding-bottom:320px}.sm\:sl-pl-80{padding-left:320px}.sm\:sl-pt-96{padding-top:384px}.sm\:sl-pr-96{padding-right:384px}.sm\:sl-pb-96{padding-bottom:384px}.sm\:sl-pl-96{padding-left:384px}.sm\:sl-pt-px{padding-top:1px}.sm\:sl-pr-px{padding-right:1px}.sm\:sl-pb-px{padding-bottom:1px}.sm\:sl-pl-px{padding-left:1px}.sm\:sl-pt-0\.5{padding-top:2px}.sm\:sl-pr-0\.5{padding-right:2px}.sm\:sl-pb-0\.5{padding-bottom:2px}.sm\:sl-pl-0\.5{padding-left:2px}.sm\:sl-pt-1\.5{padding-top:6px}.sm\:sl-pr-1\.5{padding-right:6px}.sm\:sl-pb-1\.5{padding-bottom:6px}.sm\:sl-pl-1\.5{padding-left:6px}.sm\:sl-pt-2\.5{padding-top:10px}.sm\:sl-pr-2\.5{padding-right:10px}.sm\:sl-pb-2\.5{padding-bottom:10px}.sm\:sl-pl-2\.5{padding-left:10px}.sm\:sl-pt-3\.5{padding-top:14px}.sm\:sl-pr-3\.5{padding-right:14px}.sm\:sl-pb-3\.5{padding-bottom:14px}.sm\:sl-pl-3\.5{padding-left:14px}.sm\:sl-pt-4\.5{padding-top:18px}.sm\:sl-pr-4\.5{padding-right:18px}.sm\:sl-pb-4\.5{padding-bottom:18px}.sm\:sl-pl-4\.5{padding-left:18px}.sm\:sl-static{position:static}.sm\:sl-fixed{position:fixed}.sm\:sl-absolute{position:absolute}.sm\:sl-relative{position:relative}.sm\:sl-sticky{position:-webkit-sticky;position:sticky}.sm\:sl-visible{visibility:visible}.sm\:sl-invisible{visibility:hidden}.sl-group:hover .sm\:group-hover\:sl-visible{visibility:visible}.sl-group:hover .sm\:group-hover\:sl-invisible{visibility:hidden}.sl-group:focus .sm\:group-focus\:sl-visible{visibility:visible}.sl-group:focus 
.sm\:group-focus\:sl-invisible{visibility:hidden}.sm\:sl-w-0{width:0}.sm\:sl-w-1{width:4px}.sm\:sl-w-2{width:8px}.sm\:sl-w-3{width:12px}.sm\:sl-w-4{width:16px}.sm\:sl-w-5{width:20px}.sm\:sl-w-6{width:24px}.sm\:sl-w-7{width:28px}.sm\:sl-w-8{width:32px}.sm\:sl-w-9{width:36px}.sm\:sl-w-10{width:40px}.sm\:sl-w-11{width:44px}.sm\:sl-w-12{width:48px}.sm\:sl-w-14{width:56px}.sm\:sl-w-16{width:64px}.sm\:sl-w-20{width:80px}.sm\:sl-w-24{width:96px}.sm\:sl-w-28{width:112px}.sm\:sl-w-32{width:128px}.sm\:sl-w-36{width:144px}.sm\:sl-w-40{width:160px}.sm\:sl-w-44{width:176px}.sm\:sl-w-48{width:192px}.sm\:sl-w-52{width:208px}.sm\:sl-w-56{width:224px}.sm\:sl-w-60{width:240px}.sm\:sl-w-64{width:256px}.sm\:sl-w-72{width:288px}.sm\:sl-w-80{width:320px}.sm\:sl-w-96{width:384px}.sm\:sl-w-auto{width:auto}.sm\:sl-w-px{width:1px}.sm\:sl-w-0\.5{width:2px}.sm\:sl-w-1\.5{width:6px}.sm\:sl-w-2\.5{width:10px}.sm\:sl-w-3\.5{width:14px}.sm\:sl-w-4\.5{width:18px}.sm\:sl-w-xs{width:20px}.sm\:sl-w-sm{width:24px}.sm\:sl-w-md{width:32px}.sm\:sl-w-lg{width:36px}.sm\:sl-w-xl{width:44px}.sm\:sl-w-2xl{width:52px}.sm\:sl-w-3xl{width:60px}.sm\:sl-w-1\/2{width:50%}.sm\:sl-w-1\/3{width:33.333333%}.sm\:sl-w-2\/3{width:66.666667%}.sm\:sl-w-1\/4{width:25%}.sm\:sl-w-2\/4{width:50%}.sm\:sl-w-3\/4{width:75%}.sm\:sl-w-1\/5{width:20%}.sm\:sl-w-2\/5{width:40%}.sm\:sl-w-3\/5{width:60%}.sm\:sl-w-4\/5{width:80%}.sm\:sl-w-1\/6{width:16.666667%}.sm\:sl-w-2\/6{width:33.333333%}.sm\:sl-w-3\/6{width:50%}.sm\:sl-w-4\/6{width:66.666667%}.sm\:sl-w-5\/6{width:83.333333%}.sm\:sl-w-full{width:100%}.sm\:sl-w-screen{width:100vw}.sm\:sl-w-min{width:-moz-min-content;width:min-content}.sm\:sl-w-max{width:-moz-max-content;width:max-content}}@media (max-width:767px){.sl-stack--horizontal.md\:sl-stack--1>:not(style)~:not(style){margin-left:4px}.sl-stack--vertical.md\:sl-stack--1>:not(style)~:not(style){margin-top:4px}.sl-stack--horizontal.md\:sl-stack--2>:not(style)~:not(style){margin-left:8px}.sl-stack--vertical.md\:sl-stack--2>:not(style)~:not(style){margin-top:8px}.sl-stack--horizontal.md\:sl-stack--3>:not(style)~:not(style){margin-left:12px}.sl-stack--vertical.md\:sl-stack--3>:not(style)~:not(style){margin-top:12px}.sl-stack--horizontal.md\:sl-stack--4>:not(style)~:not(style){margin-left:16px}.sl-stack--vertical.md\:sl-stack--4>:not(style)~:not(style){margin-top:16px}.sl-stack--horizontal.md\:sl-stack--5>:not(style)~:not(style){margin-left:20px}.sl-stack--vertical.md\:sl-stack--5>:not(style)~:not(style){margin-top:20px}.sl-stack--horizontal.md\:sl-stack--6>:not(style)~:not(style){margin-left:24px}.sl-stack--vertical.md\:sl-stack--6>:not(style)~:not(style){margin-top:24px}.sl-stack--horizontal.md\:sl-stack--7>:not(style)~:not(style){margin-left:28px}.sl-stack--vertical.md\:sl-stack--7>:not(style)~:not(style){margin-top:28px}.sl-stack--horizontal.md\:sl-stack--8>:not(style)~:not(style){margin-left:32px}.sl-stack--vertical.md\:sl-stack--8>:not(style)~:not(style){margin-top:32px}.sl-stack--horizontal.md\:sl-stack--9>:not(style)~:not(style){margin-left:36px}.sl-stack--vertical.md\:sl-stack--9>:not(style)~:not(style){margin-top:36px}.sl-stack--horizontal.md\:sl-stack--10>:not(style)~:not(style){margin-left:40px}.sl-stack--vertical.md\:sl-stack--10>:not(style)~:not(style){margin-top:40px}.sl-stack--horizontal.md\:sl-stack--12>:not(style)~:not(style){margin-left:48px}.sl-stack--vertical.md\:sl-stack--12>:not(style)~:not(style){margin-top:48px}.sl-stack--horizontal.md\:sl-stack--14>:not(style)~:not(style){margin-left:56px}.sl-stack--vertical.md\:sl-stack--14>:not(style)
~:not(style){margin-top:56px}.sl-stack--horizontal.md\:sl-stack--16>:not(style)~:not(style){margin-left:64px}.sl-stack--vertical.md\:sl-stack--16>:not(style)~:not(style){margin-top:64px}.sl-stack--horizontal.md\:sl-stack--20>:not(style)~:not(style){margin-left:80px}.sl-stack--vertical.md\:sl-stack--20>:not(style)~:not(style){margin-top:80px}.sl-stack--horizontal.md\:sl-stack--24>:not(style)~:not(style){margin-left:96px}.sl-stack--vertical.md\:sl-stack--24>:not(style)~:not(style){margin-top:96px}.sl-stack--horizontal.md\:sl-stack--32>:not(style)~:not(style){margin-left:128px}.sl-stack--vertical.md\:sl-stack--32>:not(style)~:not(style){margin-top:128px}.md\:sl-content-center{align-content:center}.md\:sl-content-start{align-content:flex-start}.md\:sl-content-end{align-content:flex-end}.md\:sl-content-between{align-content:space-between}.md\:sl-content-around{align-content:space-around}.md\:sl-content-evenly{align-content:space-evenly}.md\:sl-items-start{align-items:flex-start}.md\:sl-items-end{align-items:flex-end}.md\:sl-items-center{align-items:center}.md\:sl-items-baseline{align-items:baseline}.md\:sl-items-stretch{align-items:stretch}.md\:sl-self-auto{align-self:auto}.md\:sl-self-start{align-self:flex-start}.md\:sl-self-end{align-self:flex-end}.md\:sl-self-center{align-self:center}.md\:sl-self-stretch{align-self:stretch}.md\:sl-blur-0,.md\:sl-blur-none{--tw-blur:blur(0)}.md\:sl-blur-sm{--tw-blur:blur(4px)}.md\:sl-blur{--tw-blur:blur(8px)}.md\:sl-blur-md{--tw-blur:blur(12px)}.md\:sl-blur-lg{--tw-blur:blur(16px)}.md\:sl-blur-xl{--tw-blur:blur(24px)}.md\:sl-blur-2xl{--tw-blur:blur(40px)}.md\:sl-blur-3xl{--tw-blur:blur(64px)}.md\:sl-block{display:block}.md\:sl-inline-block{display:inline-block}.md\:sl-inline{display:inline}.md\:sl-flex{display:flex}.md\:sl-inline-flex{display:inline-flex}.md\:sl-table{display:table}.md\:sl-inline-table{display:inline-table}.md\:sl-table-caption{display:table-caption}.md\:sl-table-cell{display:table-cell}.md\:sl-table-column{display:table-column}.md\:sl-table-column-group{display:table-column-group}.md\:sl-table-footer-group{display:table-footer-group}.md\:sl-table-header-group{display:table-header-group}.md\:sl-table-row-group{display:table-row-group}.md\:sl-table-row{display:table-row}.md\:sl-flow-root{display:flow-root}.md\:sl-grid{display:grid}.md\:sl-inline-grid{display:inline-grid}.md\:sl-contents{display:contents}.md\:sl-list-item{display:list-item}.md\:sl-hidden{display:none}.md\:sl-drop-shadow{--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2))}.md\:sl-flex-1{flex:1 1}.md\:sl-flex-auto{flex:1 1 auto}.md\:sl-flex-initial{flex:0 1 
auto}.md\:sl-flex-none{flex:none}.md\:sl-flex-row{flex-direction:row}.md\:sl-flex-row-reverse{flex-direction:row-reverse}.md\:sl-flex-col{flex-direction:column}.md\:sl-flex-col-reverse{flex-direction:column-reverse}.md\:sl-flex-grow-0{flex-grow:0}.md\:sl-flex-grow{flex-grow:1}.md\:sl-flex-shrink-0{flex-shrink:0}.md\:sl-flex-shrink{flex-shrink:1}.md\:sl-flex-wrap{flex-wrap:wrap}.md\:sl-flex-wrap-reverse{flex-wrap:wrap-reverse}.md\:sl-flex-nowrap{flex-wrap:nowrap}.md\:sl-h-0{height:0}.md\:sl-h-1{height:4px}.md\:sl-h-2{height:8px}.md\:sl-h-3{height:12px}.md\:sl-h-4{height:16px}.md\:sl-h-5{height:20px}.md\:sl-h-6{height:24px}.md\:sl-h-7{height:28px}.md\:sl-h-8{height:32px}.md\:sl-h-9{height:36px}.md\:sl-h-10{height:40px}.md\:sl-h-11{height:44px}.md\:sl-h-12{height:48px}.md\:sl-h-14{height:56px}.md\:sl-h-16{height:64px}.md\:sl-h-20{height:80px}.md\:sl-h-24{height:96px}.md\:sl-h-28{height:112px}.md\:sl-h-32{height:128px}.md\:sl-h-36{height:144px}.md\:sl-h-40{height:160px}.md\:sl-h-44{height:176px}.md\:sl-h-48{height:192px}.md\:sl-h-52{height:208px}.md\:sl-h-56{height:224px}.md\:sl-h-60{height:240px}.md\:sl-h-64{height:256px}.md\:sl-h-72{height:288px}.md\:sl-h-80{height:320px}.md\:sl-h-96{height:384px}.md\:sl-h-auto{height:auto}.md\:sl-h-px{height:1px}.md\:sl-h-0\.5{height:2px}.md\:sl-h-1\.5{height:6px}.md\:sl-h-2\.5{height:10px}.md\:sl-h-3\.5{height:14px}.md\:sl-h-4\.5{height:18px}.md\:sl-h-xs{height:20px}.md\:sl-h-sm{height:24px}.md\:sl-h-md{height:32px}.md\:sl-h-lg{height:36px}.md\:sl-h-xl{height:44px}.md\:sl-h-2xl{height:52px}.md\:sl-h-3xl{height:60px}.md\:sl-h-full{height:100%}.md\:sl-h-screen{height:100vh}.md\:sl-justify-start{justify-content:flex-start}.md\:sl-justify-end{justify-content:flex-end}.md\:sl-justify-center{justify-content:center}.md\:sl-justify-between{justify-content:space-between}.md\:sl-justify-around{justify-content:space-around}.md\:sl-justify-evenly{justify-content:space-evenly}.md\:sl-justify-items-start{justify-items:start}.md\:sl-justify-items-end{justify-items:end}.md\:sl-justify-items-center{justify-items:center}.md\:sl-justify-items-stretch{justify-items:stretch}.md\:sl-justify-self-auto{justify-self:auto}.md\:sl-justify-self-start{justify-self:start}.md\:sl-justify-self-end{justify-self:end}.md\:sl-justify-self-center{justify-self:center}.md\:sl-justify-self-stretch{justify-self:stretch}.md\:sl-m-0{margin:0}.md\:sl-m-1{margin:4px}.md\:sl-m-2{margin:8px}.md\:sl-m-3{margin:12px}.md\:sl-m-4{margin:16px}.md\:sl-m-5{margin:20px}.md\:sl-m-6{margin:24px}.md\:sl-m-7{margin:28px}.md\:sl-m-8{margin:32px}.md\:sl-m-9{margin:36px}.md\:sl-m-10{margin:40px}.md\:sl-m-11{margin:44px}.md\:sl-m-12{margin:48px}.md\:sl-m-14{margin:56px}.md\:sl-m-16{margin:64px}.md\:sl-m-20{margin:80px}.md\:sl-m-24{margin:96px}.md\:sl-m-28{margin:112px}.md\:sl-m-32{margin:128px}.md\:sl-m-36{margin:144px}.md\:sl-m-40{margin:160px}.md\:sl-m-44{margin:176px}.md\:sl-m-48{margin:192px}.md\:sl-m-52{margin:208px}.md\:sl-m-56{margin:224px}.md\:sl-m-60{margin:240px}.md\:sl-m-64{margin:256px}.md\:sl-m-72{margin:288px}.md\:sl-m-80{margin:320px}.md\:sl-m-96{margin:384px}.md\:sl-m-auto{margin:auto}.md\:sl-m-px{margin:1px}.md\:sl-m-0\.5{margin:2px}.md\:sl-m-1\.5{margin:6px}.md\:sl-m-2\.5{margin:10px}.md\:sl-m-3\.5{margin:14px}.md\:sl-m-4\.5{margin:18px}.md\:sl--m-0{margin:0}.md\:sl--m-1{margin:-4px}.md\:sl--m-2{margin:-8px}.md\:sl--m-3{margin:-12px}.md\:sl--m-4{margin:-16px}.md\:sl--m-5{margin:-20px}.md\:sl--m-6{margin:-24px}.md\:sl--m-7{margin:-28px}.md\:sl--m-8{margin:-32px}.md\:sl--m-9{margin:-36px}.md\:sl--m-10{m
argin:-40px}.md\:sl--m-11{margin:-44px}.md\:sl--m-12{margin:-48px}.md\:sl--m-14{margin:-56px}.md\:sl--m-16{margin:-64px}.md\:sl--m-20{margin:-80px}.md\:sl--m-24{margin:-96px}.md\:sl--m-28{margin:-112px}.md\:sl--m-32{margin:-128px}.md\:sl--m-36{margin:-144px}.md\:sl--m-40{margin:-160px}.md\:sl--m-44{margin:-176px}.md\:sl--m-48{margin:-192px}.md\:sl--m-52{margin:-208px}.md\:sl--m-56{margin:-224px}.md\:sl--m-60{margin:-240px}.md\:sl--m-64{margin:-256px}.md\:sl--m-72{margin:-288px}.md\:sl--m-80{margin:-320px}.md\:sl--m-96{margin:-384px}.md\:sl--m-px{margin:-1px}.md\:sl--m-0\.5{margin:-2px}.md\:sl--m-1\.5{margin:-6px}.md\:sl--m-2\.5{margin:-10px}.md\:sl--m-3\.5{margin:-14px}.md\:sl--m-4\.5{margin:-18px}.md\:sl-my-0{margin-bottom:0;margin-top:0}.md\:sl-mx-0{margin-left:0;margin-right:0}.md\:sl-my-1{margin-bottom:4px;margin-top:4px}.md\:sl-mx-1{margin-left:4px;margin-right:4px}.md\:sl-my-2{margin-bottom:8px;margin-top:8px}.md\:sl-mx-2{margin-left:8px;margin-right:8px}.md\:sl-my-3{margin-bottom:12px;margin-top:12px}.md\:sl-mx-3{margin-left:12px;margin-right:12px}.md\:sl-my-4{margin-bottom:16px;margin-top:16px}.md\:sl-mx-4{margin-left:16px;margin-right:16px}.md\:sl-my-5{margin-bottom:20px;margin-top:20px}.md\:sl-mx-5{margin-left:20px;margin-right:20px}.md\:sl-my-6{margin-bottom:24px;margin-top:24px}.md\:sl-mx-6{margin-left:24px;margin-right:24px}.md\:sl-my-7{margin-bottom:28px;margin-top:28px}.md\:sl-mx-7{margin-left:28px;margin-right:28px}.md\:sl-my-8{margin-bottom:32px;margin-top:32px}.md\:sl-mx-8{margin-left:32px;margin-right:32px}.md\:sl-my-9{margin-bottom:36px;margin-top:36px}.md\:sl-mx-9{margin-left:36px;margin-right:36px}.md\:sl-my-10{margin-bottom:40px;margin-top:40px}.md\:sl-mx-10{margin-left:40px;margin-right:40px}.md\:sl-my-11{margin-bottom:44px;margin-top:44px}.md\:sl-mx-11{margin-left:44px;margin-right:44px}.md\:sl-my-12{margin-bottom:48px;margin-top:48px}.md\:sl-mx-12{margin-left:48px;margin-right:48px}.md\:sl-my-14{margin-bottom:56px;margin-top:56px}.md\:sl-mx-14{margin-left:56px;margin-right:56px}.md\:sl-my-16{margin-bottom:64px;margin-top:64px}.md\:sl-mx-16{margin-left:64px;margin-right:64px}.md\:sl-my-20{margin-bottom:80px;margin-top:80px}.md\:sl-mx-20{margin-left:80px;margin-right:80px}.md\:sl-my-24{margin-bottom:96px;margin-top:96px}.md\:sl-mx-24{margin-left:96px;margin-right:96px}.md\:sl-my-28{margin-bottom:112px;margin-top:112px}.md\:sl-mx-28{margin-left:112px;margin-right:112px}.md\:sl-my-32{margin-bottom:128px;margin-top:128px}.md\:sl-mx-32{margin-left:128px;margin-right:128px}.md\:sl-my-36{margin-bottom:144px;margin-top:144px}.md\:sl-mx-36{margin-left:144px;margin-right:144px}.md\:sl-my-40{margin-bottom:160px;margin-top:160px}.md\:sl-mx-40{margin-left:160px;margin-right:160px}.md\:sl-my-44{margin-bottom:176px;margin-top:176px}.md\:sl-mx-44{margin-left:176px;margin-right:176px}.md\:sl-my-48{margin-bottom:192px;margin-top:192px}.md\:sl-mx-48{margin-left:192px;margin-right:192px}.md\:sl-my-52{margin-bottom:208px;margin-top:208px}.md\:sl-mx-52{margin-left:208px;margin-right:208px}.md\:sl-my-56{margin-bottom:224px;margin-top:224px}.md\:sl-mx-56{margin-left:224px;margin-right:224px}.md\:sl-my-60{margin-bottom:240px;margin-top:240px}.md\:sl-mx-60{margin-left:240px;margin-right:240px}.md\:sl-my-64{margin-bottom:256px;margin-top:256px}.md\:sl-mx-64{margin-left:256px;margin-right:256px}.md\:sl-my-72{margin-bottom:288px;margin-top:288px}.md\:sl-mx-72{margin-left:288px;margin-right:288px}.md\:sl-my-80{margin-bottom:320px;margin-top:320px}.md\:sl-mx-80{margin-left:320px;margin-right:320p
x}.md\:sl-my-96{margin-bottom:384px;margin-top:384px}.md\:sl-mx-96{margin-left:384px;margin-right:384px}.md\:sl-my-auto{margin-bottom:auto;margin-top:auto}.md\:sl-mx-auto{margin-left:auto;margin-right:auto}.md\:sl-my-px{margin-bottom:1px;margin-top:1px}.md\:sl-mx-px{margin-left:1px;margin-right:1px}.md\:sl-my-0\.5{margin-bottom:2px;margin-top:2px}.md\:sl-mx-0\.5{margin-left:2px;margin-right:2px}.md\:sl-my-1\.5{margin-bottom:6px;margin-top:6px}.md\:sl-mx-1\.5{margin-left:6px;margin-right:6px}.md\:sl-my-2\.5{margin-bottom:10px;margin-top:10px}.md\:sl-mx-2\.5{margin-left:10px;margin-right:10px}.md\:sl-my-3\.5{margin-bottom:14px;margin-top:14px}.md\:sl-mx-3\.5{margin-left:14px;margin-right:14px}.md\:sl-my-4\.5{margin-bottom:18px;margin-top:18px}.md\:sl-mx-4\.5{margin-left:18px;margin-right:18px}.md\:sl--my-0{margin-bottom:0;margin-top:0}.md\:sl--mx-0{margin-left:0;margin-right:0}.md\:sl--my-1{margin-bottom:-4px;margin-top:-4px}.md\:sl--mx-1{margin-left:-4px;margin-right:-4px}.md\:sl--my-2{margin-bottom:-8px;margin-top:-8px}.md\:sl--mx-2{margin-left:-8px;margin-right:-8px}.md\:sl--my-3{margin-bottom:-12px;margin-top:-12px}.md\:sl--mx-3{margin-left:-12px;margin-right:-12px}.md\:sl--my-4{margin-bottom:-16px;margin-top:-16px}.md\:sl--mx-4{margin-left:-16px;margin-right:-16px}.md\:sl--my-5{margin-bottom:-20px;margin-top:-20px}.md\:sl--mx-5{margin-left:-20px;margin-right:-20px}.md\:sl--my-6{margin-bottom:-24px;margin-top:-24px}.md\:sl--mx-6{margin-left:-24px;margin-right:-24px}.md\:sl--my-7{margin-bottom:-28px;margin-top:-28px}.md\:sl--mx-7{margin-left:-28px;margin-right:-28px}.md\:sl--my-8{margin-bottom:-32px;margin-top:-32px}.md\:sl--mx-8{margin-left:-32px;margin-right:-32px}.md\:sl--my-9{margin-bottom:-36px;margin-top:-36px}.md\:sl--mx-9{margin-left:-36px;margin-right:-36px}.md\:sl--my-10{margin-bottom:-40px;margin-top:-40px}.md\:sl--mx-10{margin-left:-40px;margin-right:-40px}.md\:sl--my-11{margin-bottom:-44px;margin-top:-44px}.md\:sl--mx-11{margin-left:-44px;margin-right:-44px}.md\:sl--my-12{margin-bottom:-48px;margin-top:-48px}.md\:sl--mx-12{margin-left:-48px;margin-right:-48px}.md\:sl--my-14{margin-bottom:-56px;margin-top:-56px}.md\:sl--mx-14{margin-left:-56px;margin-right:-56px}.md\:sl--my-16{margin-bottom:-64px;margin-top:-64px}.md\:sl--mx-16{margin-left:-64px;margin-right:-64px}.md\:sl--my-20{margin-bottom:-80px;margin-top:-80px}.md\:sl--mx-20{margin-left:-80px;margin-right:-80px}.md\:sl--my-24{margin-bottom:-96px;margin-top:-96px}.md\:sl--mx-24{margin-left:-96px;margin-right:-96px}.md\:sl--my-28{margin-bottom:-112px;margin-top:-112px}.md\:sl--mx-28{margin-left:-112px;margin-right:-112px}.md\:sl--my-32{margin-bottom:-128px;margin-top:-128px}.md\:sl--mx-32{margin-left:-128px;margin-right:-128px}.md\:sl--my-36{margin-bottom:-144px;margin-top:-144px}.md\:sl--mx-36{margin-left:-144px;margin-right:-144px}.md\:sl--my-40{margin-bottom:-160px;margin-top:-160px}.md\:sl--mx-40{margin-left:-160px;margin-right:-160px}.md\:sl--my-44{margin-bottom:-176px;margin-top:-176px}.md\:sl--mx-44{margin-left:-176px;margin-right:-176px}.md\:sl--my-48{margin-bottom:-192px;margin-top:-192px}.md\:sl--mx-48{margin-left:-192px;margin-right:-192px}.md\:sl--my-52{margin-bottom:-208px;margin-top:-208px}.md\:sl--mx-52{margin-left:-208px;margin-right:-208px}.md\:sl--my-56{margin-bottom:-224px;margin-top:-224px}.md\:sl--mx-56{margin-left:-224px;margin-right:-224px}.md\:sl--my-60{margin-bottom:-240px;margin-top:-240px}.md\:sl--mx-60{margin-left:-240px;margin-right:-240px}.md\:sl--my-64{margin-bottom:-256px;margin-top:-256px}.md
\:sl--mx-64{margin-left:-256px;margin-right:-256px}.md\:sl--my-72{margin-bottom:-288px;margin-top:-288px}.md\:sl--mx-72{margin-left:-288px;margin-right:-288px}.md\:sl--my-80{margin-bottom:-320px;margin-top:-320px}.md\:sl--mx-80{margin-left:-320px;margin-right:-320px}.md\:sl--my-96{margin-bottom:-384px;margin-top:-384px}.md\:sl--mx-96{margin-left:-384px;margin-right:-384px}.md\:sl--my-px{margin-bottom:-1px;margin-top:-1px}.md\:sl--mx-px{margin-left:-1px;margin-right:-1px}.md\:sl--my-0\.5{margin-bottom:-2px;margin-top:-2px}.md\:sl--mx-0\.5{margin-left:-2px;margin-right:-2px}.md\:sl--my-1\.5{margin-bottom:-6px;margin-top:-6px}.md\:sl--mx-1\.5{margin-left:-6px;margin-right:-6px}.md\:sl--my-2\.5{margin-bottom:-10px;margin-top:-10px}.md\:sl--mx-2\.5{margin-left:-10px;margin-right:-10px}.md\:sl--my-3\.5{margin-bottom:-14px;margin-top:-14px}.md\:sl--mx-3\.5{margin-left:-14px;margin-right:-14px}.md\:sl--my-4\.5{margin-bottom:-18px;margin-top:-18px}.md\:sl--mx-4\.5{margin-left:-18px;margin-right:-18px}.md\:sl-mt-0{margin-top:0}.md\:sl-mr-0{margin-right:0}.md\:sl-mb-0{margin-bottom:0}.md\:sl-ml-0{margin-left:0}.md\:sl-mt-1{margin-top:4px}.md\:sl-mr-1{margin-right:4px}.md\:sl-mb-1{margin-bottom:4px}.md\:sl-ml-1{margin-left:4px}.md\:sl-mt-2{margin-top:8px}.md\:sl-mr-2{margin-right:8px}.md\:sl-mb-2{margin-bottom:8px}.md\:sl-ml-2{margin-left:8px}.md\:sl-mt-3{margin-top:12px}.md\:sl-mr-3{margin-right:12px}.md\:sl-mb-3{margin-bottom:12px}.md\:sl-ml-3{margin-left:12px}.md\:sl-mt-4{margin-top:16px}.md\:sl-mr-4{margin-right:16px}.md\:sl-mb-4{margin-bottom:16px}.md\:sl-ml-4{margin-left:16px}.md\:sl-mt-5{margin-top:20px}.md\:sl-mr-5{margin-right:20px}.md\:sl-mb-5{margin-bottom:20px}.md\:sl-ml-5{margin-left:20px}.md\:sl-mt-6{margin-top:24px}.md\:sl-mr-6{margin-right:24px}.md\:sl-mb-6{margin-bottom:24px}.md\:sl-ml-6{margin-left:24px}.md\:sl-mt-7{margin-top:28px}.md\:sl-mr-7{margin-right:28px}.md\:sl-mb-7{margin-bottom:28px}.md\:sl-ml-7{margin-left:28px}.md\:sl-mt-8{margin-top:32px}.md\:sl-mr-8{margin-right:32px}.md\:sl-mb-8{margin-bottom:32px}.md\:sl-ml-8{margin-left:32px}.md\:sl-mt-9{margin-top:36px}.md\:sl-mr-9{margin-right:36px}.md\:sl-mb-9{margin-bottom:36px}.md\:sl-ml-9{margin-left:36px}.md\:sl-mt-10{margin-top:40px}.md\:sl-mr-10{margin-right:40px}.md\:sl-mb-10{margin-bottom:40px}.md\:sl-ml-10{margin-left:40px}.md\:sl-mt-11{margin-top:44px}.md\:sl-mr-11{margin-right:44px}.md\:sl-mb-11{margin-bottom:44px}.md\:sl-ml-11{margin-left:44px}.md\:sl-mt-12{margin-top:48px}.md\:sl-mr-12{margin-right:48px}.md\:sl-mb-12{margin-bottom:48px}.md\:sl-ml-12{margin-left:48px}.md\:sl-mt-14{margin-top:56px}.md\:sl-mr-14{margin-right:56px}.md\:sl-mb-14{margin-bottom:56px}.md\:sl-ml-14{margin-left:56px}.md\:sl-mt-16{margin-top:64px}.md\:sl-mr-16{margin-right:64px}.md\:sl-mb-16{margin-bottom:64px}.md\:sl-ml-16{margin-left:64px}.md\:sl-mt-20{margin-top:80px}.md\:sl-mr-20{margin-right:80px}.md\:sl-mb-20{margin-bottom:80px}.md\:sl-ml-20{margin-left:80px}.md\:sl-mt-24{margin-top:96px}.md\:sl-mr-24{margin-right:96px}.md\:sl-mb-24{margin-bottom:96px}.md\:sl-ml-24{margin-left:96px}.md\:sl-mt-28{margin-top:112px}.md\:sl-mr-28{margin-right:112px}.md\:sl-mb-28{margin-bottom:112px}.md\:sl-ml-28{margin-left:112px}.md\:sl-mt-32{margin-top:128px}.md\:sl-mr-32{margin-right:128px}.md\:sl-mb-32{margin-bottom:128px}.md\:sl-ml-32{margin-left:128px}.md\:sl-mt-36{margin-top:144px}.md\:sl-mr-36{margin-right:144px}.md\:sl-mb-36{margin-bottom:144px}.md\:sl-ml-36{margin-left:144px}.md\:sl-mt-40{margin-top:160px}.md\:sl-mr-40{margin-right:160px}.md\:sl-mb-4
0{margin-bottom:160px}.md\:sl-ml-40{margin-left:160px}.md\:sl-mt-44{margin-top:176px}.md\:sl-mr-44{margin-right:176px}.md\:sl-mb-44{margin-bottom:176px}.md\:sl-ml-44{margin-left:176px}.md\:sl-mt-48{margin-top:192px}.md\:sl-mr-48{margin-right:192px}.md\:sl-mb-48{margin-bottom:192px}.md\:sl-ml-48{margin-left:192px}.md\:sl-mt-52{margin-top:208px}.md\:sl-mr-52{margin-right:208px}.md\:sl-mb-52{margin-bottom:208px}.md\:sl-ml-52{margin-left:208px}.md\:sl-mt-56{margin-top:224px}.md\:sl-mr-56{margin-right:224px}.md\:sl-mb-56{margin-bottom:224px}.md\:sl-ml-56{margin-left:224px}.md\:sl-mt-60{margin-top:240px}.md\:sl-mr-60{margin-right:240px}.md\:sl-mb-60{margin-bottom:240px}.md\:sl-ml-60{margin-left:240px}.md\:sl-mt-64{margin-top:256px}.md\:sl-mr-64{margin-right:256px}.md\:sl-mb-64{margin-bottom:256px}.md\:sl-ml-64{margin-left:256px}.md\:sl-mt-72{margin-top:288px}.md\:sl-mr-72{margin-right:288px}.md\:sl-mb-72{margin-bottom:288px}.md\:sl-ml-72{margin-left:288px}.md\:sl-mt-80{margin-top:320px}.md\:sl-mr-80{margin-right:320px}.md\:sl-mb-80{margin-bottom:320px}.md\:sl-ml-80{margin-left:320px}.md\:sl-mt-96{margin-top:384px}.md\:sl-mr-96{margin-right:384px}.md\:sl-mb-96{margin-bottom:384px}.md\:sl-ml-96{margin-left:384px}.md\:sl-mt-auto{margin-top:auto}.md\:sl-mr-auto{margin-right:auto}.md\:sl-mb-auto{margin-bottom:auto}.md\:sl-ml-auto{margin-left:auto}.md\:sl-mt-px{margin-top:1px}.md\:sl-mr-px{margin-right:1px}.md\:sl-mb-px{margin-bottom:1px}.md\:sl-ml-px{margin-left:1px}.md\:sl-mt-0\.5{margin-top:2px}.md\:sl-mr-0\.5{margin-right:2px}.md\:sl-mb-0\.5{margin-bottom:2px}.md\:sl-ml-0\.5{margin-left:2px}.md\:sl-mt-1\.5{margin-top:6px}.md\:sl-mr-1\.5{margin-right:6px}.md\:sl-mb-1\.5{margin-bottom:6px}.md\:sl-ml-1\.5{margin-left:6px}.md\:sl-mt-2\.5{margin-top:10px}.md\:sl-mr-2\.5{margin-right:10px}.md\:sl-mb-2\.5{margin-bottom:10px}.md\:sl-ml-2\.5{margin-left:10px}.md\:sl-mt-3\.5{margin-top:14px}.md\:sl-mr-3\.5{margin-right:14px}.md\:sl-mb-3\.5{margin-bottom:14px}.md\:sl-ml-3\.5{margin-left:14px}.md\:sl-mt-4\.5{margin-top:18px}.md\:sl-mr-4\.5{margin-right:18px}.md\:sl-mb-4\.5{margin-bottom:18px}.md\:sl-ml-4\.5{margin-left:18px}.md\:sl--mt-0{margin-top:0}.md\:sl--mr-0{margin-right:0}.md\:sl--mb-0{margin-bottom:0}.md\:sl--ml-0{margin-left:0}.md\:sl--mt-1{margin-top:-4px}.md\:sl--mr-1{margin-right:-4px}.md\:sl--mb-1{margin-bottom:-4px}.md\:sl--ml-1{margin-left:-4px}.md\:sl--mt-2{margin-top:-8px}.md\:sl--mr-2{margin-right:-8px}.md\:sl--mb-2{margin-bottom:-8px}.md\:sl--ml-2{margin-left:-8px}.md\:sl--mt-3{margin-top:-12px}.md\:sl--mr-3{margin-right:-12px}.md\:sl--mb-3{margin-bottom:-12px}.md\:sl--ml-3{margin-left:-12px}.md\:sl--mt-4{margin-top:-16px}.md\:sl--mr-4{margin-right:-16px}.md\:sl--mb-4{margin-bottom:-16px}.md\:sl--ml-4{margin-left:-16px}.md\:sl--mt-5{margin-top:-20px}.md\:sl--mr-5{margin-right:-20px}.md\:sl--mb-5{margin-bottom:-20px}.md\:sl--ml-5{margin-left:-20px}.md\:sl--mt-6{margin-top:-24px}.md\:sl--mr-6{margin-right:-24px}.md\:sl--mb-6{margin-bottom:-24px}.md\:sl--ml-6{margin-left:-24px}.md\:sl--mt-7{margin-top:-28px}.md\:sl--mr-7{margin-right:-28px}.md\:sl--mb-7{margin-bottom:-28px}.md\:sl--ml-7{margin-left:-28px}.md\:sl--mt-8{margin-top:-32px}.md\:sl--mr-8{margin-right:-32px}.md\:sl--mb-8{margin-bottom:-32px}.md\:sl--ml-8{margin-left:-32px}.md\:sl--mt-9{margin-top:-36px}.md\:sl--mr-9{margin-right:-36px}.md\:sl--mb-9{margin-bottom:-36px}.md\:sl--ml-9{margin-left:-36px}.md\:sl--mt-10{margin-top:-40px}.md\:sl--mr-10{margin-right:-40px}.md\:sl--mb-10{margin-bottom:-40px}.md\:sl--ml-10{margin-left:-40px}.md
\:sl--mt-11{margin-top:-44px}.md\:sl--mr-11{margin-right:-44px}.md\:sl--mb-11{margin-bottom:-44px}.md\:sl--ml-11{margin-left:-44px}.md\:sl--mt-12{margin-top:-48px}.md\:sl--mr-12{margin-right:-48px}.md\:sl--mb-12{margin-bottom:-48px}.md\:sl--ml-12{margin-left:-48px}.md\:sl--mt-14{margin-top:-56px}.md\:sl--mr-14{margin-right:-56px}.md\:sl--mb-14{margin-bottom:-56px}.md\:sl--ml-14{margin-left:-56px}.md\:sl--mt-16{margin-top:-64px}.md\:sl--mr-16{margin-right:-64px}.md\:sl--mb-16{margin-bottom:-64px}.md\:sl--ml-16{margin-left:-64px}.md\:sl--mt-20{margin-top:-80px}.md\:sl--mr-20{margin-right:-80px}.md\:sl--mb-20{margin-bottom:-80px}.md\:sl--ml-20{margin-left:-80px}.md\:sl--mt-24{margin-top:-96px}.md\:sl--mr-24{margin-right:-96px}.md\:sl--mb-24{margin-bottom:-96px}.md\:sl--ml-24{margin-left:-96px}.md\:sl--mt-28{margin-top:-112px}.md\:sl--mr-28{margin-right:-112px}.md\:sl--mb-28{margin-bottom:-112px}.md\:sl--ml-28{margin-left:-112px}.md\:sl--mt-32{margin-top:-128px}.md\:sl--mr-32{margin-right:-128px}.md\:sl--mb-32{margin-bottom:-128px}.md\:sl--ml-32{margin-left:-128px}.md\:sl--mt-36{margin-top:-144px}.md\:sl--mr-36{margin-right:-144px}.md\:sl--mb-36{margin-bottom:-144px}.md\:sl--ml-36{margin-left:-144px}.md\:sl--mt-40{margin-top:-160px}.md\:sl--mr-40{margin-right:-160px}.md\:sl--mb-40{margin-bottom:-160px}.md\:sl--ml-40{margin-left:-160px}.md\:sl--mt-44{margin-top:-176px}.md\:sl--mr-44{margin-right:-176px}.md\:sl--mb-44{margin-bottom:-176px}.md\:sl--ml-44{margin-left:-176px}.md\:sl--mt-48{margin-top:-192px}.md\:sl--mr-48{margin-right:-192px}.md\:sl--mb-48{margin-bottom:-192px}.md\:sl--ml-48{margin-left:-192px}.md\:sl--mt-52{margin-top:-208px}.md\:sl--mr-52{margin-right:-208px}.md\:sl--mb-52{margin-bottom:-208px}.md\:sl--ml-52{margin-left:-208px}.md\:sl--mt-56{margin-top:-224px}.md\:sl--mr-56{margin-right:-224px}.md\:sl--mb-56{margin-bottom:-224px}.md\:sl--ml-56{margin-left:-224px}.md\:sl--mt-60{margin-top:-240px}.md\:sl--mr-60{margin-right:-240px}.md\:sl--mb-60{margin-bottom:-240px}.md\:sl--ml-60{margin-left:-240px}.md\:sl--mt-64{margin-top:-256px}.md\:sl--mr-64{margin-right:-256px}.md\:sl--mb-64{margin-bottom:-256px}.md\:sl--ml-64{margin-left:-256px}.md\:sl--mt-72{margin-top:-288px}.md\:sl--mr-72{margin-right:-288px}.md\:sl--mb-72{margin-bottom:-288px}.md\:sl--ml-72{margin-left:-288px}.md\:sl--mt-80{margin-top:-320px}.md\:sl--mr-80{margin-right:-320px}.md\:sl--mb-80{margin-bottom:-320px}.md\:sl--ml-80{margin-left:-320px}.md\:sl--mt-96{margin-top:-384px}.md\:sl--mr-96{margin-right:-384px}.md\:sl--mb-96{margin-bottom:-384px}.md\:sl--ml-96{margin-left:-384px}.md\:sl--mt-px{margin-top:-1px}.md\:sl--mr-px{margin-right:-1px}.md\:sl--mb-px{margin-bottom:-1px}.md\:sl--ml-px{margin-left:-1px}.md\:sl--mt-0\.5{margin-top:-2px}.md\:sl--mr-0\.5{margin-right:-2px}.md\:sl--mb-0\.5{margin-bottom:-2px}.md\:sl--ml-0\.5{margin-left:-2px}.md\:sl--mt-1\.5{margin-top:-6px}.md\:sl--mr-1\.5{margin-right:-6px}.md\:sl--mb-1\.5{margin-bottom:-6px}.md\:sl--ml-1\.5{margin-left:-6px}.md\:sl--mt-2\.5{margin-top:-10px}.md\:sl--mr-2\.5{margin-right:-10px}.md\:sl--mb-2\.5{margin-bottom:-10px}.md\:sl--ml-2\.5{margin-left:-10px}.md\:sl--mt-3\.5{margin-top:-14px}.md\:sl--mr-3\.5{margin-right:-14px}.md\:sl--mb-3\.5{margin-bottom:-14px}.md\:sl--ml-3\.5{margin-left:-14px}.md\:sl--mt-4\.5{margin-top:-18px}.md\:sl--mr-4\.5{margin-right:-18px}.md\:sl--mb-4\.5{margin-bottom:-18px}.md\:sl--ml-4\.5{margin-left:-18px}.md\:sl-max-h-full{max-height:100%}.md\:sl-max-h-screen{max-height:100vh}.md\:sl-max-w-none{max-width:none}.md\:sl-max-w-full{ma
x-width:100%}.md\:sl-max-w-min{max-width:-moz-min-content;max-width:min-content}.md\:sl-max-w-max{max-width:-moz-max-content;max-width:max-content}.md\:sl-max-w-prose{max-width:65ch}.md\:sl-min-h-full{min-height:100%}.md\:sl-min-h-screen{min-height:100vh}.md\:sl-min-w-full{min-width:100%}.md\:sl-min-w-min{min-width:-moz-min-content;min-width:min-content}.md\:sl-min-w-max{min-width:-moz-max-content;min-width:max-content}.md\:sl-p-0{padding:0}.md\:sl-p-1{padding:4px}.md\:sl-p-2{padding:8px}.md\:sl-p-3{padding:12px}.md\:sl-p-4{padding:16px}.md\:sl-p-5{padding:20px}.md\:sl-p-6{padding:24px}.md\:sl-p-7{padding:28px}.md\:sl-p-8{padding:32px}.md\:sl-p-9{padding:36px}.md\:sl-p-10{padding:40px}.md\:sl-p-11{padding:44px}.md\:sl-p-12{padding:48px}.md\:sl-p-14{padding:56px}.md\:sl-p-16{padding:64px}.md\:sl-p-20{padding:80px}.md\:sl-p-24{padding:96px}.md\:sl-p-28{padding:112px}.md\:sl-p-32{padding:128px}.md\:sl-p-36{padding:144px}.md\:sl-p-40{padding:160px}.md\:sl-p-44{padding:176px}.md\:sl-p-48{padding:192px}.md\:sl-p-52{padding:208px}.md\:sl-p-56{padding:224px}.md\:sl-p-60{padding:240px}.md\:sl-p-64{padding:256px}.md\:sl-p-72{padding:288px}.md\:sl-p-80{padding:320px}.md\:sl-p-96{padding:384px}.md\:sl-p-px{padding:1px}.md\:sl-p-0\.5{padding:2px}.md\:sl-p-1\.5{padding:6px}.md\:sl-p-2\.5{padding:10px}.md\:sl-p-3\.5{padding:14px}.md\:sl-p-4\.5{padding:18px}.md\:sl-py-0{padding-bottom:0;padding-top:0}.md\:sl-px-0{padding-left:0;padding-right:0}.md\:sl-py-1{padding-bottom:4px;padding-top:4px}.md\:sl-px-1{padding-left:4px;padding-right:4px}.md\:sl-py-2{padding-bottom:8px;padding-top:8px}.md\:sl-px-2{padding-left:8px;padding-right:8px}.md\:sl-py-3{padding-bottom:12px;padding-top:12px}.md\:sl-px-3{padding-left:12px;padding-right:12px}.md\:sl-py-4{padding-bottom:16px;padding-top:16px}.md\:sl-px-4{padding-left:16px;padding-right:16px}.md\:sl-py-5{padding-bottom:20px;padding-top:20px}.md\:sl-px-5{padding-left:20px;padding-right:20px}.md\:sl-py-6{padding-bottom:24px;padding-top:24px}.md\:sl-px-6{padding-left:24px;padding-right:24px}.md\:sl-py-7{padding-bottom:28px;padding-top:28px}.md\:sl-px-7{padding-left:28px;padding-right:28px}.md\:sl-py-8{padding-bottom:32px;padding-top:32px}.md\:sl-px-8{padding-left:32px;padding-right:32px}.md\:sl-py-9{padding-bottom:36px;padding-top:36px}.md\:sl-px-9{padding-left:36px;padding-right:36px}.md\:sl-py-10{padding-bottom:40px;padding-top:40px}.md\:sl-px-10{padding-left:40px;padding-right:40px}.md\:sl-py-11{padding-bottom:44px;padding-top:44px}.md\:sl-px-11{padding-left:44px;padding-right:44px}.md\:sl-py-12{padding-bottom:48px;padding-top:48px}.md\:sl-px-12{padding-left:48px;padding-right:48px}.md\:sl-py-14{padding-bottom:56px;padding-top:56px}.md\:sl-px-14{padding-left:56px;padding-right:56px}.md\:sl-py-16{padding-bottom:64px;padding-top:64px}.md\:sl-px-16{padding-left:64px;padding-right:64px}.md\:sl-py-20{padding-bottom:80px;padding-top:80px}.md\:sl-px-20{padding-left:80px;padding-right:80px}.md\:sl-py-24{padding-bottom:96px;padding-top:96px}.md\:sl-px-24{padding-left:96px;padding-right:96px}.md\:sl-py-28{padding-bottom:112px;padding-top:112px}.md\:sl-px-28{padding-left:112px;padding-right:112px}.md\:sl-py-32{padding-bottom:128px;padding-top:128px}.md\:sl-px-32{padding-left:128px;padding-right:128px}.md\:sl-py-36{padding-bottom:144px;padding-top:144px}.md\:sl-px-36{padding-left:144px;padding-right:144px}.md\:sl-py-40{padding-bottom:160px;padding-top:160px}.md\:sl-px-40{padding-left:160px;padding-right:160px}.md\:sl-py-44{padding-bottom:176px;padding-top:176px}.md\:sl-px-44{paddin
g-left:176px;padding-right:176px}.md\:sl-py-48{padding-bottom:192px;padding-top:192px}.md\:sl-px-48{padding-left:192px;padding-right:192px}.md\:sl-py-52{padding-bottom:208px;padding-top:208px}.md\:sl-px-52{padding-left:208px;padding-right:208px}.md\:sl-py-56{padding-bottom:224px;padding-top:224px}.md\:sl-px-56{padding-left:224px;padding-right:224px}.md\:sl-py-60{padding-bottom:240px;padding-top:240px}.md\:sl-px-60{padding-left:240px;padding-right:240px}.md\:sl-py-64{padding-bottom:256px;padding-top:256px}.md\:sl-px-64{padding-left:256px;padding-right:256px}.md\:sl-py-72{padding-bottom:288px;padding-top:288px}.md\:sl-px-72{padding-left:288px;padding-right:288px}.md\:sl-py-80{padding-bottom:320px;padding-top:320px}.md\:sl-px-80{padding-left:320px;padding-right:320px}.md\:sl-py-96{padding-bottom:384px;padding-top:384px}.md\:sl-px-96{padding-left:384px;padding-right:384px}.md\:sl-py-px{padding-bottom:1px;padding-top:1px}.md\:sl-px-px{padding-left:1px;padding-right:1px}.md\:sl-py-0\.5{padding-bottom:2px;padding-top:2px}.md\:sl-px-0\.5{padding-left:2px;padding-right:2px}.md\:sl-py-1\.5{padding-bottom:6px;padding-top:6px}.md\:sl-px-1\.5{padding-left:6px;padding-right:6px}.md\:sl-py-2\.5{padding-bottom:10px;padding-top:10px}.md\:sl-px-2\.5{padding-left:10px;padding-right:10px}.md\:sl-py-3\.5{padding-bottom:14px;padding-top:14px}.md\:sl-px-3\.5{padding-left:14px;padding-right:14px}.md\:sl-py-4\.5{padding-bottom:18px;padding-top:18px}.md\:sl-px-4\.5{padding-left:18px;padding-right:18px}.md\:sl-pt-0{padding-top:0}.md\:sl-pr-0{padding-right:0}.md\:sl-pb-0{padding-bottom:0}.md\:sl-pl-0{padding-left:0}.md\:sl-pt-1{padding-top:4px}.md\:sl-pr-1{padding-right:4px}.md\:sl-pb-1{padding-bottom:4px}.md\:sl-pl-1{padding-left:4px}.md\:sl-pt-2{padding-top:8px}.md\:sl-pr-2{padding-right:8px}.md\:sl-pb-2{padding-bottom:8px}.md\:sl-pl-2{padding-left:8px}.md\:sl-pt-3{padding-top:12px}.md\:sl-pr-3{padding-right:12px}.md\:sl-pb-3{padding-bottom:12px}.md\:sl-pl-3{padding-left:12px}.md\:sl-pt-4{padding-top:16px}.md\:sl-pr-4{padding-right:16px}.md\:sl-pb-4{padding-bottom:16px}.md\:sl-pl-4{padding-left:16px}.md\:sl-pt-5{padding-top:20px}.md\:sl-pr-5{padding-right:20px}.md\:sl-pb-5{padding-bottom:20px}.md\:sl-pl-5{padding-left:20px}.md\:sl-pt-6{padding-top:24px}.md\:sl-pr-6{padding-right:24px}.md\:sl-pb-6{padding-bottom:24px}.md\:sl-pl-6{padding-left:24px}.md\:sl-pt-7{padding-top:28px}.md\:sl-pr-7{padding-right:28px}.md\:sl-pb-7{padding-bottom:28px}.md\:sl-pl-7{padding-left:28px}.md\:sl-pt-8{padding-top:32px}.md\:sl-pr-8{padding-right:32px}.md\:sl-pb-8{padding-bottom:32px}.md\:sl-pl-8{padding-left:32px}.md\:sl-pt-9{padding-top:36px}.md\:sl-pr-9{padding-right:36px}.md\:sl-pb-9{padding-bottom:36px}.md\:sl-pl-9{padding-left:36px}.md\:sl-pt-10{padding-top:40px}.md\:sl-pr-10{padding-right:40px}.md\:sl-pb-10{padding-bottom:40px}.md\:sl-pl-10{padding-left:40px}.md\:sl-pt-11{padding-top:44px}.md\:sl-pr-11{padding-right:44px}.md\:sl-pb-11{padding-bottom:44px}.md\:sl-pl-11{padding-left:44px}.md\:sl-pt-12{padding-top:48px}.md\:sl-pr-12{padding-right:48px}.md\:sl-pb-12{padding-bottom:48px}.md\:sl-pl-12{padding-left:48px}.md\:sl-pt-14{padding-top:56px}.md\:sl-pr-14{padding-right:56px}.md\:sl-pb-14{padding-bottom:56px}.md\:sl-pl-14{padding-left:56px}.md\:sl-pt-16{padding-top:64px}.md\:sl-pr-16{padding-right:64px}.md\:sl-pb-16{padding-bottom:64px}.md\:sl-pl-16{padding-left:64px}.md\:sl-pt-20{padding-top:80px}.md\:sl-pr-20{padding-right:80px}.md\:sl-pb-20{padding-bottom:80px}.md\:sl-pl-20{padding-left:80px}.md\:sl-pt-24{padding-top:96px}.md
\:sl-pr-24{padding-right:96px}.md\:sl-pb-24{padding-bottom:96px}.md\:sl-pl-24{padding-left:96px}.md\:sl-pt-28{padding-top:112px}.md\:sl-pr-28{padding-right:112px}.md\:sl-pb-28{padding-bottom:112px}.md\:sl-pl-28{padding-left:112px}.md\:sl-pt-32{padding-top:128px}.md\:sl-pr-32{padding-right:128px}.md\:sl-pb-32{padding-bottom:128px}.md\:sl-pl-32{padding-left:128px}.md\:sl-pt-36{padding-top:144px}.md\:sl-pr-36{padding-right:144px}.md\:sl-pb-36{padding-bottom:144px}.md\:sl-pl-36{padding-left:144px}.md\:sl-pt-40{padding-top:160px}.md\:sl-pr-40{padding-right:160px}.md\:sl-pb-40{padding-bottom:160px}.md\:sl-pl-40{padding-left:160px}.md\:sl-pt-44{padding-top:176px}.md\:sl-pr-44{padding-right:176px}.md\:sl-pb-44{padding-bottom:176px}.md\:sl-pl-44{padding-left:176px}.md\:sl-pt-48{padding-top:192px}.md\:sl-pr-48{padding-right:192px}.md\:sl-pb-48{padding-bottom:192px}.md\:sl-pl-48{padding-left:192px}.md\:sl-pt-52{padding-top:208px}.md\:sl-pr-52{padding-right:208px}.md\:sl-pb-52{padding-bottom:208px}.md\:sl-pl-52{padding-left:208px}.md\:sl-pt-56{padding-top:224px}.md\:sl-pr-56{padding-right:224px}.md\:sl-pb-56{padding-bottom:224px}.md\:sl-pl-56{padding-left:224px}.md\:sl-pt-60{padding-top:240px}.md\:sl-pr-60{padding-right:240px}.md\:sl-pb-60{padding-bottom:240px}.md\:sl-pl-60{padding-left:240px}.md\:sl-pt-64{padding-top:256px}.md\:sl-pr-64{padding-right:256px}.md\:sl-pb-64{padding-bottom:256px}.md\:sl-pl-64{padding-left:256px}.md\:sl-pt-72{padding-top:288px}.md\:sl-pr-72{padding-right:288px}.md\:sl-pb-72{padding-bottom:288px}.md\:sl-pl-72{padding-left:288px}.md\:sl-pt-80{padding-top:320px}.md\:sl-pr-80{padding-right:320px}.md\:sl-pb-80{padding-bottom:320px}.md\:sl-pl-80{padding-left:320px}.md\:sl-pt-96{padding-top:384px}.md\:sl-pr-96{padding-right:384px}.md\:sl-pb-96{padding-bottom:384px}.md\:sl-pl-96{padding-left:384px}.md\:sl-pt-px{padding-top:1px}.md\:sl-pr-px{padding-right:1px}.md\:sl-pb-px{padding-bottom:1px}.md\:sl-pl-px{padding-left:1px}.md\:sl-pt-0\.5{padding-top:2px}.md\:sl-pr-0\.5{padding-right:2px}.md\:sl-pb-0\.5{padding-bottom:2px}.md\:sl-pl-0\.5{padding-left:2px}.md\:sl-pt-1\.5{padding-top:6px}.md\:sl-pr-1\.5{padding-right:6px}.md\:sl-pb-1\.5{padding-bottom:6px}.md\:sl-pl-1\.5{padding-left:6px}.md\:sl-pt-2\.5{padding-top:10px}.md\:sl-pr-2\.5{padding-right:10px}.md\:sl-pb-2\.5{padding-bottom:10px}.md\:sl-pl-2\.5{padding-left:10px}.md\:sl-pt-3\.5{padding-top:14px}.md\:sl-pr-3\.5{padding-right:14px}.md\:sl-pb-3\.5{padding-bottom:14px}.md\:sl-pl-3\.5{padding-left:14px}.md\:sl-pt-4\.5{padding-top:18px}.md\:sl-pr-4\.5{padding-right:18px}.md\:sl-pb-4\.5{padding-bottom:18px}.md\:sl-pl-4\.5{padding-left:18px}.md\:sl-static{position:static}.md\:sl-fixed{position:fixed}.md\:sl-absolute{position:absolute}.md\:sl-relative{position:relative}.md\:sl-sticky{position:-webkit-sticky;position:sticky}.md\:sl-visible{visibility:visible}.md\:sl-invisible{visibility:hidden}.sl-group:hover .md\:group-hover\:sl-visible{visibility:visible}.sl-group:hover .md\:group-hover\:sl-invisible{visibility:hidden}.sl-group:focus .md\:group-focus\:sl-visible{visibility:visible}.sl-group:focus 
.md\:group-focus\:sl-invisible{visibility:hidden}.md\:sl-w-0{width:0}.md\:sl-w-1{width:4px}.md\:sl-w-2{width:8px}.md\:sl-w-3{width:12px}.md\:sl-w-4{width:16px}.md\:sl-w-5{width:20px}.md\:sl-w-6{width:24px}.md\:sl-w-7{width:28px}.md\:sl-w-8{width:32px}.md\:sl-w-9{width:36px}.md\:sl-w-10{width:40px}.md\:sl-w-11{width:44px}.md\:sl-w-12{width:48px}.md\:sl-w-14{width:56px}.md\:sl-w-16{width:64px}.md\:sl-w-20{width:80px}.md\:sl-w-24{width:96px}.md\:sl-w-28{width:112px}.md\:sl-w-32{width:128px}.md\:sl-w-36{width:144px}.md\:sl-w-40{width:160px}.md\:sl-w-44{width:176px}.md\:sl-w-48{width:192px}.md\:sl-w-52{width:208px}.md\:sl-w-56{width:224px}.md\:sl-w-60{width:240px}.md\:sl-w-64{width:256px}.md\:sl-w-72{width:288px}.md\:sl-w-80{width:320px}.md\:sl-w-96{width:384px}.md\:sl-w-auto{width:auto}.md\:sl-w-px{width:1px}.md\:sl-w-0\.5{width:2px}.md\:sl-w-1\.5{width:6px}.md\:sl-w-2\.5{width:10px}.md\:sl-w-3\.5{width:14px}.md\:sl-w-4\.5{width:18px}.md\:sl-w-xs{width:20px}.md\:sl-w-sm{width:24px}.md\:sl-w-md{width:32px}.md\:sl-w-lg{width:36px}.md\:sl-w-xl{width:44px}.md\:sl-w-2xl{width:52px}.md\:sl-w-3xl{width:60px}.md\:sl-w-1\/2{width:50%}.md\:sl-w-1\/3{width:33.333333%}.md\:sl-w-2\/3{width:66.666667%}.md\:sl-w-1\/4{width:25%}.md\:sl-w-2\/4{width:50%}.md\:sl-w-3\/4{width:75%}.md\:sl-w-1\/5{width:20%}.md\:sl-w-2\/5{width:40%}.md\:sl-w-3\/5{width:60%}.md\:sl-w-4\/5{width:80%}.md\:sl-w-1\/6{width:16.666667%}.md\:sl-w-2\/6{width:33.333333%}.md\:sl-w-3\/6{width:50%}.md\:sl-w-4\/6{width:66.666667%}.md\:sl-w-5\/6{width:83.333333%}.md\:sl-w-full{width:100%}.md\:sl-w-screen{width:100vw}.md\:sl-w-min{width:-moz-min-content;width:min-content}.md\:sl-w-max{width:-moz-max-content;width:max-content}}@media (max-width:975px){.sl-stack--horizontal.lg\:sl-stack--1>:not(style)~:not(style){margin-left:4px}.sl-stack--vertical.lg\:sl-stack--1>:not(style)~:not(style){margin-top:4px}.sl-stack--horizontal.lg\:sl-stack--2>:not(style)~:not(style){margin-left:8px}.sl-stack--vertical.lg\:sl-stack--2>:not(style)~:not(style){margin-top:8px}.sl-stack--horizontal.lg\:sl-stack--3>:not(style)~:not(style){margin-left:12px}.sl-stack--vertical.lg\:sl-stack--3>:not(style)~:not(style){margin-top:12px}.sl-stack--horizontal.lg\:sl-stack--4>:not(style)~:not(style){margin-left:16px}.sl-stack--vertical.lg\:sl-stack--4>:not(style)~:not(style){margin-top:16px}.sl-stack--horizontal.lg\:sl-stack--5>:not(style)~:not(style){margin-left:20px}.sl-stack--vertical.lg\:sl-stack--5>:not(style)~:not(style){margin-top:20px}.sl-stack--horizontal.lg\:sl-stack--6>:not(style)~:not(style){margin-left:24px}.sl-stack--vertical.lg\:sl-stack--6>:not(style)~:not(style){margin-top:24px}.sl-stack--horizontal.lg\:sl-stack--7>:not(style)~:not(style){margin-left:28px}.sl-stack--vertical.lg\:sl-stack--7>:not(style)~:not(style){margin-top:28px}.sl-stack--horizontal.lg\:sl-stack--8>:not(style)~:not(style){margin-left:32px}.sl-stack--vertical.lg\:sl-stack--8>:not(style)~:not(style){margin-top:32px}.sl-stack--horizontal.lg\:sl-stack--9>:not(style)~:not(style){margin-left:36px}.sl-stack--vertical.lg\:sl-stack--9>:not(style)~:not(style){margin-top:36px}.sl-stack--horizontal.lg\:sl-stack--10>:not(style)~:not(style){margin-left:40px}.sl-stack--vertical.lg\:sl-stack--10>:not(style)~:not(style){margin-top:40px}.sl-stack--horizontal.lg\:sl-stack--12>:not(style)~:not(style){margin-left:48px}.sl-stack--vertical.lg\:sl-stack--12>:not(style)~:not(style){margin-top:48px}.sl-stack--horizontal.lg\:sl-stack--14>:not(style)~:not(style){margin-left:56px}.sl-stack--vertical.lg\:sl-stack--14>:not(style)
~:not(style){margin-top:56px}.sl-stack--horizontal.lg\:sl-stack--16>:not(style)~:not(style){margin-left:64px}.sl-stack--vertical.lg\:sl-stack--16>:not(style)~:not(style){margin-top:64px}.sl-stack--horizontal.lg\:sl-stack--20>:not(style)~:not(style){margin-left:80px}.sl-stack--vertical.lg\:sl-stack--20>:not(style)~:not(style){margin-top:80px}.sl-stack--horizontal.lg\:sl-stack--24>:not(style)~:not(style){margin-left:96px}.sl-stack--vertical.lg\:sl-stack--24>:not(style)~:not(style){margin-top:96px}.sl-stack--horizontal.lg\:sl-stack--32>:not(style)~:not(style){margin-left:128px}.sl-stack--vertical.lg\:sl-stack--32>:not(style)~:not(style){margin-top:128px}.lg\:sl-content-center{align-content:center}.lg\:sl-content-start{align-content:flex-start}.lg\:sl-content-end{align-content:flex-end}.lg\:sl-content-between{align-content:space-between}.lg\:sl-content-around{align-content:space-around}.lg\:sl-content-evenly{align-content:space-evenly}.lg\:sl-items-start{align-items:flex-start}.lg\:sl-items-end{align-items:flex-end}.lg\:sl-items-center{align-items:center}.lg\:sl-items-baseline{align-items:baseline}.lg\:sl-items-stretch{align-items:stretch}.lg\:sl-self-auto{align-self:auto}.lg\:sl-self-start{align-self:flex-start}.lg\:sl-self-end{align-self:flex-end}.lg\:sl-self-center{align-self:center}.lg\:sl-self-stretch{align-self:stretch}.lg\:sl-blur-0,.lg\:sl-blur-none{--tw-blur:blur(0)}.lg\:sl-blur-sm{--tw-blur:blur(4px)}.lg\:sl-blur{--tw-blur:blur(8px)}.lg\:sl-blur-md{--tw-blur:blur(12px)}.lg\:sl-blur-lg{--tw-blur:blur(16px)}.lg\:sl-blur-xl{--tw-blur:blur(24px)}.lg\:sl-blur-2xl{--tw-blur:blur(40px)}.lg\:sl-blur-3xl{--tw-blur:blur(64px)}.lg\:sl-block{display:block}.lg\:sl-inline-block{display:inline-block}.lg\:sl-inline{display:inline}.lg\:sl-flex{display:flex}.lg\:sl-inline-flex{display:inline-flex}.lg\:sl-table{display:table}.lg\:sl-inline-table{display:inline-table}.lg\:sl-table-caption{display:table-caption}.lg\:sl-table-cell{display:table-cell}.lg\:sl-table-column{display:table-column}.lg\:sl-table-column-group{display:table-column-group}.lg\:sl-table-footer-group{display:table-footer-group}.lg\:sl-table-header-group{display:table-header-group}.lg\:sl-table-row-group{display:table-row-group}.lg\:sl-table-row{display:table-row}.lg\:sl-flow-root{display:flow-root}.lg\:sl-grid{display:grid}.lg\:sl-inline-grid{display:inline-grid}.lg\:sl-contents{display:contents}.lg\:sl-list-item{display:list-item}.lg\:sl-hidden{display:none}.lg\:sl-drop-shadow{--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2))}.lg\:sl-flex-1{flex:1 1}.lg\:sl-flex-auto{flex:1 1 auto}.lg\:sl-flex-initial{flex:0 1 
auto}.lg\:sl-flex-none{flex:none}.lg\:sl-flex-row{flex-direction:row}.lg\:sl-flex-row-reverse{flex-direction:row-reverse}.lg\:sl-flex-col{flex-direction:column}.lg\:sl-flex-col-reverse{flex-direction:column-reverse}.lg\:sl-flex-grow-0{flex-grow:0}.lg\:sl-flex-grow{flex-grow:1}.lg\:sl-flex-shrink-0{flex-shrink:0}.lg\:sl-flex-shrink{flex-shrink:1}.lg\:sl-flex-wrap{flex-wrap:wrap}.lg\:sl-flex-wrap-reverse{flex-wrap:wrap-reverse}.lg\:sl-flex-nowrap{flex-wrap:nowrap}.lg\:sl-h-0{height:0}.lg\:sl-h-1{height:4px}.lg\:sl-h-2{height:8px}.lg\:sl-h-3{height:12px}.lg\:sl-h-4{height:16px}.lg\:sl-h-5{height:20px}.lg\:sl-h-6{height:24px}.lg\:sl-h-7{height:28px}.lg\:sl-h-8{height:32px}.lg\:sl-h-9{height:36px}.lg\:sl-h-10{height:40px}.lg\:sl-h-11{height:44px}.lg\:sl-h-12{height:48px}.lg\:sl-h-14{height:56px}.lg\:sl-h-16{height:64px}.lg\:sl-h-20{height:80px}.lg\:sl-h-24{height:96px}.lg\:sl-h-28{height:112px}.lg\:sl-h-32{height:128px}.lg\:sl-h-36{height:144px}.lg\:sl-h-40{height:160px}.lg\:sl-h-44{height:176px}.lg\:sl-h-48{height:192px}.lg\:sl-h-52{height:208px}.lg\:sl-h-56{height:224px}.lg\:sl-h-60{height:240px}.lg\:sl-h-64{height:256px}.lg\:sl-h-72{height:288px}.lg\:sl-h-80{height:320px}.lg\:sl-h-96{height:384px}.lg\:sl-h-auto{height:auto}.lg\:sl-h-px{height:1px}.lg\:sl-h-0\.5{height:2px}.lg\:sl-h-1\.5{height:6px}.lg\:sl-h-2\.5{height:10px}.lg\:sl-h-3\.5{height:14px}.lg\:sl-h-4\.5{height:18px}.lg\:sl-h-xs{height:20px}.lg\:sl-h-sm{height:24px}.lg\:sl-h-md{height:32px}.lg\:sl-h-lg{height:36px}.lg\:sl-h-xl{height:44px}.lg\:sl-h-2xl{height:52px}.lg\:sl-h-3xl{height:60px}.lg\:sl-h-full{height:100%}.lg\:sl-h-screen{height:100vh}.lg\:sl-justify-start{justify-content:flex-start}.lg\:sl-justify-end{justify-content:flex-end}.lg\:sl-justify-center{justify-content:center}.lg\:sl-justify-between{justify-content:space-between}.lg\:sl-justify-around{justify-content:space-around}.lg\:sl-justify-evenly{justify-content:space-evenly}.lg\:sl-justify-items-start{justify-items:start}.lg\:sl-justify-items-end{justify-items:end}.lg\:sl-justify-items-center{justify-items:center}.lg\:sl-justify-items-stretch{justify-items:stretch}.lg\:sl-justify-self-auto{justify-self:auto}.lg\:sl-justify-self-start{justify-self:start}.lg\:sl-justify-self-end{justify-self:end}.lg\:sl-justify-self-center{justify-self:center}.lg\:sl-justify-self-stretch{justify-self:stretch}.lg\:sl-m-0{margin:0}.lg\:sl-m-1{margin:4px}.lg\:sl-m-2{margin:8px}.lg\:sl-m-3{margin:12px}.lg\:sl-m-4{margin:16px}.lg\:sl-m-5{margin:20px}.lg\:sl-m-6{margin:24px}.lg\:sl-m-7{margin:28px}.lg\:sl-m-8{margin:32px}.lg\:sl-m-9{margin:36px}.lg\:sl-m-10{margin:40px}.lg\:sl-m-11{margin:44px}.lg\:sl-m-12{margin:48px}.lg\:sl-m-14{margin:56px}.lg\:sl-m-16{margin:64px}.lg\:sl-m-20{margin:80px}.lg\:sl-m-24{margin:96px}.lg\:sl-m-28{margin:112px}.lg\:sl-m-32{margin:128px}.lg\:sl-m-36{margin:144px}.lg\:sl-m-40{margin:160px}.lg\:sl-m-44{margin:176px}.lg\:sl-m-48{margin:192px}.lg\:sl-m-52{margin:208px}.lg\:sl-m-56{margin:224px}.lg\:sl-m-60{margin:240px}.lg\:sl-m-64{margin:256px}.lg\:sl-m-72{margin:288px}.lg\:sl-m-80{margin:320px}.lg\:sl-m-96{margin:384px}.lg\:sl-m-auto{margin:auto}.lg\:sl-m-px{margin:1px}.lg\:sl-m-0\.5{margin:2px}.lg\:sl-m-1\.5{margin:6px}.lg\:sl-m-2\.5{margin:10px}.lg\:sl-m-3\.5{margin:14px}.lg\:sl-m-4\.5{margin:18px}.lg\:sl--m-0{margin:0}.lg\:sl--m-1{margin:-4px}.lg\:sl--m-2{margin:-8px}.lg\:sl--m-3{margin:-12px}.lg\:sl--m-4{margin:-16px}.lg\:sl--m-5{margin:-20px}.lg\:sl--m-6{margin:-24px}.lg\:sl--m-7{margin:-28px}.lg\:sl--m-8{margin:-32px}.lg\:sl--m-9{margin:-36px}.lg\:sl--m-10{m
argin:-40px}.lg\:sl--m-11{margin:-44px}.lg\:sl--m-12{margin:-48px}.lg\:sl--m-14{margin:-56px}.lg\:sl--m-16{margin:-64px}.lg\:sl--m-20{margin:-80px}.lg\:sl--m-24{margin:-96px}.lg\:sl--m-28{margin:-112px}.lg\:sl--m-32{margin:-128px}.lg\:sl--m-36{margin:-144px}.lg\:sl--m-40{margin:-160px}.lg\:sl--m-44{margin:-176px}.lg\:sl--m-48{margin:-192px}.lg\:sl--m-52{margin:-208px}.lg\:sl--m-56{margin:-224px}.lg\:sl--m-60{margin:-240px}.lg\:sl--m-64{margin:-256px}.lg\:sl--m-72{margin:-288px}.lg\:sl--m-80{margin:-320px}.lg\:sl--m-96{margin:-384px}.lg\:sl--m-px{margin:-1px}.lg\:sl--m-0\.5{margin:-2px}.lg\:sl--m-1\.5{margin:-6px}.lg\:sl--m-2\.5{margin:-10px}.lg\:sl--m-3\.5{margin:-14px}.lg\:sl--m-4\.5{margin:-18px}.lg\:sl-my-0{margin-bottom:0;margin-top:0}.lg\:sl-mx-0{margin-left:0;margin-right:0}.lg\:sl-my-1{margin-bottom:4px;margin-top:4px}.lg\:sl-mx-1{margin-left:4px;margin-right:4px}.lg\:sl-my-2{margin-bottom:8px;margin-top:8px}.lg\:sl-mx-2{margin-left:8px;margin-right:8px}.lg\:sl-my-3{margin-bottom:12px;margin-top:12px}.lg\:sl-mx-3{margin-left:12px;margin-right:12px}.lg\:sl-my-4{margin-bottom:16px;margin-top:16px}.lg\:sl-mx-4{margin-left:16px;margin-right:16px}.lg\:sl-my-5{margin-bottom:20px;margin-top:20px}.lg\:sl-mx-5{margin-left:20px;margin-right:20px}.lg\:sl-my-6{margin-bottom:24px;margin-top:24px}.lg\:sl-mx-6{margin-left:24px;margin-right:24px}.lg\:sl-my-7{margin-bottom:28px;margin-top:28px}.lg\:sl-mx-7{margin-left:28px;margin-right:28px}.lg\:sl-my-8{margin-bottom:32px;margin-top:32px}.lg\:sl-mx-8{margin-left:32px;margin-right:32px}.lg\:sl-my-9{margin-bottom:36px;margin-top:36px}.lg\:sl-mx-9{margin-left:36px;margin-right:36px}.lg\:sl-my-10{margin-bottom:40px;margin-top:40px}.lg\:sl-mx-10{margin-left:40px;margin-right:40px}.lg\:sl-my-11{margin-bottom:44px;margin-top:44px}.lg\:sl-mx-11{margin-left:44px;margin-right:44px}.lg\:sl-my-12{margin-bottom:48px;margin-top:48px}.lg\:sl-mx-12{margin-left:48px;margin-right:48px}.lg\:sl-my-14{margin-bottom:56px;margin-top:56px}.lg\:sl-mx-14{margin-left:56px;margin-right:56px}.lg\:sl-my-16{margin-bottom:64px;margin-top:64px}.lg\:sl-mx-16{margin-left:64px;margin-right:64px}.lg\:sl-my-20{margin-bottom:80px;margin-top:80px}.lg\:sl-mx-20{margin-left:80px;margin-right:80px}.lg\:sl-my-24{margin-bottom:96px;margin-top:96px}.lg\:sl-mx-24{margin-left:96px;margin-right:96px}.lg\:sl-my-28{margin-bottom:112px;margin-top:112px}.lg\:sl-mx-28{margin-left:112px;margin-right:112px}.lg\:sl-my-32{margin-bottom:128px;margin-top:128px}.lg\:sl-mx-32{margin-left:128px;margin-right:128px}.lg\:sl-my-36{margin-bottom:144px;margin-top:144px}.lg\:sl-mx-36{margin-left:144px;margin-right:144px}.lg\:sl-my-40{margin-bottom:160px;margin-top:160px}.lg\:sl-mx-40{margin-left:160px;margin-right:160px}.lg\:sl-my-44{margin-bottom:176px;margin-top:176px}.lg\:sl-mx-44{margin-left:176px;margin-right:176px}.lg\:sl-my-48{margin-bottom:192px;margin-top:192px}.lg\:sl-mx-48{margin-left:192px;margin-right:192px}.lg\:sl-my-52{margin-bottom:208px;margin-top:208px}.lg\:sl-mx-52{margin-left:208px;margin-right:208px}.lg\:sl-my-56{margin-bottom:224px;margin-top:224px}.lg\:sl-mx-56{margin-left:224px;margin-right:224px}.lg\:sl-my-60{margin-bottom:240px;margin-top:240px}.lg\:sl-mx-60{margin-left:240px;margin-right:240px}.lg\:sl-my-64{margin-bottom:256px;margin-top:256px}.lg\:sl-mx-64{margin-left:256px;margin-right:256px}.lg\:sl-my-72{margin-bottom:288px;margin-top:288px}.lg\:sl-mx-72{margin-left:288px;margin-right:288px}.lg\:sl-my-80{margin-bottom:320px;margin-top:320px}.lg\:sl-mx-80{margin-left:320px;margin-right:320p
x}.lg\:sl-my-96{margin-bottom:384px;margin-top:384px}.lg\:sl-mx-96{margin-left:384px;margin-right:384px}.lg\:sl-my-auto{margin-bottom:auto;margin-top:auto}.lg\:sl-mx-auto{margin-left:auto;margin-right:auto}.lg\:sl-my-px{margin-bottom:1px;margin-top:1px}.lg\:sl-mx-px{margin-left:1px;margin-right:1px}.lg\:sl-my-0\.5{margin-bottom:2px;margin-top:2px}.lg\:sl-mx-0\.5{margin-left:2px;margin-right:2px}.lg\:sl-my-1\.5{margin-bottom:6px;margin-top:6px}.lg\:sl-mx-1\.5{margin-left:6px;margin-right:6px}.lg\:sl-my-2\.5{margin-bottom:10px;margin-top:10px}.lg\:sl-mx-2\.5{margin-left:10px;margin-right:10px}.lg\:sl-my-3\.5{margin-bottom:14px;margin-top:14px}.lg\:sl-mx-3\.5{margin-left:14px;margin-right:14px}.lg\:sl-my-4\.5{margin-bottom:18px;margin-top:18px}.lg\:sl-mx-4\.5{margin-left:18px;margin-right:18px}.lg\:sl--my-0{margin-bottom:0;margin-top:0}.lg\:sl--mx-0{margin-left:0;margin-right:0}.lg\:sl--my-1{margin-bottom:-4px;margin-top:-4px}.lg\:sl--mx-1{margin-left:-4px;margin-right:-4px}.lg\:sl--my-2{margin-bottom:-8px;margin-top:-8px}.lg\:sl--mx-2{margin-left:-8px;margin-right:-8px}.lg\:sl--my-3{margin-bottom:-12px;margin-top:-12px}.lg\:sl--mx-3{margin-left:-12px;margin-right:-12px}.lg\:sl--my-4{margin-bottom:-16px;margin-top:-16px}.lg\:sl--mx-4{margin-left:-16px;margin-right:-16px}.lg\:sl--my-5{margin-bottom:-20px;margin-top:-20px}.lg\:sl--mx-5{margin-left:-20px;margin-right:-20px}.lg\:sl--my-6{margin-bottom:-24px;margin-top:-24px}.lg\:sl--mx-6{margin-left:-24px;margin-right:-24px}.lg\:sl--my-7{margin-bottom:-28px;margin-top:-28px}.lg\:sl--mx-7{margin-left:-28px;margin-right:-28px}.lg\:sl--my-8{margin-bottom:-32px;margin-top:-32px}.lg\:sl--mx-8{margin-left:-32px;margin-right:-32px}.lg\:sl--my-9{margin-bottom:-36px;margin-top:-36px}.lg\:sl--mx-9{margin-left:-36px;margin-right:-36px}.lg\:sl--my-10{margin-bottom:-40px;margin-top:-40px}.lg\:sl--mx-10{margin-left:-40px;margin-right:-40px}.lg\:sl--my-11{margin-bottom:-44px;margin-top:-44px}.lg\:sl--mx-11{margin-left:-44px;margin-right:-44px}.lg\:sl--my-12{margin-bottom:-48px;margin-top:-48px}.lg\:sl--mx-12{margin-left:-48px;margin-right:-48px}.lg\:sl--my-14{margin-bottom:-56px;margin-top:-56px}.lg\:sl--mx-14{margin-left:-56px;margin-right:-56px}.lg\:sl--my-16{margin-bottom:-64px;margin-top:-64px}.lg\:sl--mx-16{margin-left:-64px;margin-right:-64px}.lg\:sl--my-20{margin-bottom:-80px;margin-top:-80px}.lg\:sl--mx-20{margin-left:-80px;margin-right:-80px}.lg\:sl--my-24{margin-bottom:-96px;margin-top:-96px}.lg\:sl--mx-24{margin-left:-96px;margin-right:-96px}.lg\:sl--my-28{margin-bottom:-112px;margin-top:-112px}.lg\:sl--mx-28{margin-left:-112px;margin-right:-112px}.lg\:sl--my-32{margin-bottom:-128px;margin-top:-128px}.lg\:sl--mx-32{margin-left:-128px;margin-right:-128px}.lg\:sl--my-36{margin-bottom:-144px;margin-top:-144px}.lg\:sl--mx-36{margin-left:-144px;margin-right:-144px}.lg\:sl--my-40{margin-bottom:-160px;margin-top:-160px}.lg\:sl--mx-40{margin-left:-160px;margin-right:-160px}.lg\:sl--my-44{margin-bottom:-176px;margin-top:-176px}.lg\:sl--mx-44{margin-left:-176px;margin-right:-176px}.lg\:sl--my-48{margin-bottom:-192px;margin-top:-192px}.lg\:sl--mx-48{margin-left:-192px;margin-right:-192px}.lg\:sl--my-52{margin-bottom:-208px;margin-top:-208px}.lg\:sl--mx-52{margin-left:-208px;margin-right:-208px}.lg\:sl--my-56{margin-bottom:-224px;margin-top:-224px}.lg\:sl--mx-56{margin-left:-224px;margin-right:-224px}.lg\:sl--my-60{margin-bottom:-240px;margin-top:-240px}.lg\:sl--mx-60{margin-left:-240px;margin-right:-240px}.lg\:sl--my-64{margin-bottom:-256px;margin-top:-256px}.lg
\:sl--mx-64{margin-left:-256px;margin-right:-256px}.lg\:sl--my-72{margin-bottom:-288px;margin-top:-288px}.lg\:sl--mx-72{margin-left:-288px;margin-right:-288px}.lg\:sl--my-80{margin-bottom:-320px;margin-top:-320px}.lg\:sl--mx-80{margin-left:-320px;margin-right:-320px}.lg\:sl--my-96{margin-bottom:-384px;margin-top:-384px}.lg\:sl--mx-96{margin-left:-384px;margin-right:-384px}.lg\:sl--my-px{margin-bottom:-1px;margin-top:-1px}.lg\:sl--mx-px{margin-left:-1px;margin-right:-1px}.lg\:sl--my-0\.5{margin-bottom:-2px;margin-top:-2px}.lg\:sl--mx-0\.5{margin-left:-2px;margin-right:-2px}.lg\:sl--my-1\.5{margin-bottom:-6px;margin-top:-6px}.lg\:sl--mx-1\.5{margin-left:-6px;margin-right:-6px}.lg\:sl--my-2\.5{margin-bottom:-10px;margin-top:-10px}.lg\:sl--mx-2\.5{margin-left:-10px;margin-right:-10px}.lg\:sl--my-3\.5{margin-bottom:-14px;margin-top:-14px}.lg\:sl--mx-3\.5{margin-left:-14px;margin-right:-14px}.lg\:sl--my-4\.5{margin-bottom:-18px;margin-top:-18px}.lg\:sl--mx-4\.5{margin-left:-18px;margin-right:-18px}.lg\:sl-mt-0{margin-top:0}.lg\:sl-mr-0{margin-right:0}.lg\:sl-mb-0{margin-bottom:0}.lg\:sl-ml-0{margin-left:0}.lg\:sl-mt-1{margin-top:4px}.lg\:sl-mr-1{margin-right:4px}.lg\:sl-mb-1{margin-bottom:4px}.lg\:sl-ml-1{margin-left:4px}.lg\:sl-mt-2{margin-top:8px}.lg\:sl-mr-2{margin-right:8px}.lg\:sl-mb-2{margin-bottom:8px}.lg\:sl-ml-2{margin-left:8px}.lg\:sl-mt-3{margin-top:12px}.lg\:sl-mr-3{margin-right:12px}.lg\:sl-mb-3{margin-bottom:12px}.lg\:sl-ml-3{margin-left:12px}.lg\:sl-mt-4{margin-top:16px}.lg\:sl-mr-4{margin-right:16px}.lg\:sl-mb-4{margin-bottom:16px}.lg\:sl-ml-4{margin-left:16px}.lg\:sl-mt-5{margin-top:20px}.lg\:sl-mr-5{margin-right:20px}.lg\:sl-mb-5{margin-bottom:20px}.lg\:sl-ml-5{margin-left:20px}.lg\:sl-mt-6{margin-top:24px}.lg\:sl-mr-6{margin-right:24px}.lg\:sl-mb-6{margin-bottom:24px}.lg\:sl-ml-6{margin-left:24px}.lg\:sl-mt-7{margin-top:28px}.lg\:sl-mr-7{margin-right:28px}.lg\:sl-mb-7{margin-bottom:28px}.lg\:sl-ml-7{margin-left:28px}.lg\:sl-mt-8{margin-top:32px}.lg\:sl-mr-8{margin-right:32px}.lg\:sl-mb-8{margin-bottom:32px}.lg\:sl-ml-8{margin-left:32px}.lg\:sl-mt-9{margin-top:36px}.lg\:sl-mr-9{margin-right:36px}.lg\:sl-mb-9{margin-bottom:36px}.lg\:sl-ml-9{margin-left:36px}.lg\:sl-mt-10{margin-top:40px}.lg\:sl-mr-10{margin-right:40px}.lg\:sl-mb-10{margin-bottom:40px}.lg\:sl-ml-10{margin-left:40px}.lg\:sl-mt-11{margin-top:44px}.lg\:sl-mr-11{margin-right:44px}.lg\:sl-mb-11{margin-bottom:44px}.lg\:sl-ml-11{margin-left:44px}.lg\:sl-mt-12{margin-top:48px}.lg\:sl-mr-12{margin-right:48px}.lg\:sl-mb-12{margin-bottom:48px}.lg\:sl-ml-12{margin-left:48px}.lg\:sl-mt-14{margin-top:56px}.lg\:sl-mr-14{margin-right:56px}.lg\:sl-mb-14{margin-bottom:56px}.lg\:sl-ml-14{margin-left:56px}.lg\:sl-mt-16{margin-top:64px}.lg\:sl-mr-16{margin-right:64px}.lg\:sl-mb-16{margin-bottom:64px}.lg\:sl-ml-16{margin-left:64px}.lg\:sl-mt-20{margin-top:80px}.lg\:sl-mr-20{margin-right:80px}.lg\:sl-mb-20{margin-bottom:80px}.lg\:sl-ml-20{margin-left:80px}.lg\:sl-mt-24{margin-top:96px}.lg\:sl-mr-24{margin-right:96px}.lg\:sl-mb-24{margin-bottom:96px}.lg\:sl-ml-24{margin-left:96px}.lg\:sl-mt-28{margin-top:112px}.lg\:sl-mr-28{margin-right:112px}.lg\:sl-mb-28{margin-bottom:112px}.lg\:sl-ml-28{margin-left:112px}.lg\:sl-mt-32{margin-top:128px}.lg\:sl-mr-32{margin-right:128px}.lg\:sl-mb-32{margin-bottom:128px}.lg\:sl-ml-32{margin-left:128px}.lg\:sl-mt-36{margin-top:144px}.lg\:sl-mr-36{margin-right:144px}.lg\:sl-mb-36{margin-bottom:144px}.lg\:sl-ml-36{margin-left:144px}.lg\:sl-mt-40{margin-top:160px}.lg\:sl-mr-40{margin-right:160px}.lg\:sl-mb-4
0{margin-bottom:160px}.lg\:sl-ml-40{margin-left:160px}.lg\:sl-mt-44{margin-top:176px}.lg\:sl-mr-44{margin-right:176px}.lg\:sl-mb-44{margin-bottom:176px}.lg\:sl-ml-44{margin-left:176px}.lg\:sl-mt-48{margin-top:192px}.lg\:sl-mr-48{margin-right:192px}.lg\:sl-mb-48{margin-bottom:192px}.lg\:sl-ml-48{margin-left:192px}.lg\:sl-mt-52{margin-top:208px}.lg\:sl-mr-52{margin-right:208px}.lg\:sl-mb-52{margin-bottom:208px}.lg\:sl-ml-52{margin-left:208px}.lg\:sl-mt-56{margin-top:224px}.lg\:sl-mr-56{margin-right:224px}.lg\:sl-mb-56{margin-bottom:224px}.lg\:sl-ml-56{margin-left:224px}.lg\:sl-mt-60{margin-top:240px}.lg\:sl-mr-60{margin-right:240px}.lg\:sl-mb-60{margin-bottom:240px}.lg\:sl-ml-60{margin-left:240px}.lg\:sl-mt-64{margin-top:256px}.lg\:sl-mr-64{margin-right:256px}.lg\:sl-mb-64{margin-bottom:256px}.lg\:sl-ml-64{margin-left:256px}.lg\:sl-mt-72{margin-top:288px}.lg\:sl-mr-72{margin-right:288px}.lg\:sl-mb-72{margin-bottom:288px}.lg\:sl-ml-72{margin-left:288px}.lg\:sl-mt-80{margin-top:320px}.lg\:sl-mr-80{margin-right:320px}.lg\:sl-mb-80{margin-bottom:320px}.lg\:sl-ml-80{margin-left:320px}.lg\:sl-mt-96{margin-top:384px}.lg\:sl-mr-96{margin-right:384px}.lg\:sl-mb-96{margin-bottom:384px}.lg\:sl-ml-96{margin-left:384px}.lg\:sl-mt-auto{margin-top:auto}.lg\:sl-mr-auto{margin-right:auto}.lg\:sl-mb-auto{margin-bottom:auto}.lg\:sl-ml-auto{margin-left:auto}.lg\:sl-mt-px{margin-top:1px}.lg\:sl-mr-px{margin-right:1px}.lg\:sl-mb-px{margin-bottom:1px}.lg\:sl-ml-px{margin-left:1px}.lg\:sl-mt-0\.5{margin-top:2px}.lg\:sl-mr-0\.5{margin-right:2px}.lg\:sl-mb-0\.5{margin-bottom:2px}.lg\:sl-ml-0\.5{margin-left:2px}.lg\:sl-mt-1\.5{margin-top:6px}.lg\:sl-mr-1\.5{margin-right:6px}.lg\:sl-mb-1\.5{margin-bottom:6px}.lg\:sl-ml-1\.5{margin-left:6px}.lg\:sl-mt-2\.5{margin-top:10px}.lg\:sl-mr-2\.5{margin-right:10px}.lg\:sl-mb-2\.5{margin-bottom:10px}.lg\:sl-ml-2\.5{margin-left:10px}.lg\:sl-mt-3\.5{margin-top:14px}.lg\:sl-mr-3\.5{margin-right:14px}.lg\:sl-mb-3\.5{margin-bottom:14px}.lg\:sl-ml-3\.5{margin-left:14px}.lg\:sl-mt-4\.5{margin-top:18px}.lg\:sl-mr-4\.5{margin-right:18px}.lg\:sl-mb-4\.5{margin-bottom:18px}.lg\:sl-ml-4\.5{margin-left:18px}.lg\:sl--mt-0{margin-top:0}.lg\:sl--mr-0{margin-right:0}.lg\:sl--mb-0{margin-bottom:0}.lg\:sl--ml-0{margin-left:0}.lg\:sl--mt-1{margin-top:-4px}.lg\:sl--mr-1{margin-right:-4px}.lg\:sl--mb-1{margin-bottom:-4px}.lg\:sl--ml-1{margin-left:-4px}.lg\:sl--mt-2{margin-top:-8px}.lg\:sl--mr-2{margin-right:-8px}.lg\:sl--mb-2{margin-bottom:-8px}.lg\:sl--ml-2{margin-left:-8px}.lg\:sl--mt-3{margin-top:-12px}.lg\:sl--mr-3{margin-right:-12px}.lg\:sl--mb-3{margin-bottom:-12px}.lg\:sl--ml-3{margin-left:-12px}.lg\:sl--mt-4{margin-top:-16px}.lg\:sl--mr-4{margin-right:-16px}.lg\:sl--mb-4{margin-bottom:-16px}.lg\:sl--ml-4{margin-left:-16px}.lg\:sl--mt-5{margin-top:-20px}.lg\:sl--mr-5{margin-right:-20px}.lg\:sl--mb-5{margin-bottom:-20px}.lg\:sl--ml-5{margin-left:-20px}.lg\:sl--mt-6{margin-top:-24px}.lg\:sl--mr-6{margin-right:-24px}.lg\:sl--mb-6{margin-bottom:-24px}.lg\:sl--ml-6{margin-left:-24px}.lg\:sl--mt-7{margin-top:-28px}.lg\:sl--mr-7{margin-right:-28px}.lg\:sl--mb-7{margin-bottom:-28px}.lg\:sl--ml-7{margin-left:-28px}.lg\:sl--mt-8{margin-top:-32px}.lg\:sl--mr-8{margin-right:-32px}.lg\:sl--mb-8{margin-bottom:-32px}.lg\:sl--ml-8{margin-left:-32px}.lg\:sl--mt-9{margin-top:-36px}.lg\:sl--mr-9{margin-right:-36px}.lg\:sl--mb-9{margin-bottom:-36px}.lg\:sl--ml-9{margin-left:-36px}.lg\:sl--mt-10{margin-top:-40px}.lg\:sl--mr-10{margin-right:-40px}.lg\:sl--mb-10{margin-bottom:-40px}.lg\:sl--ml-10{margin-left:-40px}.lg
\:sl--mt-11{margin-top:-44px}.lg\:sl--mr-11{margin-right:-44px}.lg\:sl--mb-11{margin-bottom:-44px}.lg\:sl--ml-11{margin-left:-44px}.lg\:sl--mt-12{margin-top:-48px}.lg\:sl--mr-12{margin-right:-48px}.lg\:sl--mb-12{margin-bottom:-48px}.lg\:sl--ml-12{margin-left:-48px}.lg\:sl--mt-14{margin-top:-56px}.lg\:sl--mr-14{margin-right:-56px}.lg\:sl--mb-14{margin-bottom:-56px}.lg\:sl--ml-14{margin-left:-56px}.lg\:sl--mt-16{margin-top:-64px}.lg\:sl--mr-16{margin-right:-64px}.lg\:sl--mb-16{margin-bottom:-64px}.lg\:sl--ml-16{margin-left:-64px}.lg\:sl--mt-20{margin-top:-80px}.lg\:sl--mr-20{margin-right:-80px}.lg\:sl--mb-20{margin-bottom:-80px}.lg\:sl--ml-20{margin-left:-80px}.lg\:sl--mt-24{margin-top:-96px}.lg\:sl--mr-24{margin-right:-96px}.lg\:sl--mb-24{margin-bottom:-96px}.lg\:sl--ml-24{margin-left:-96px}.lg\:sl--mt-28{margin-top:-112px}.lg\:sl--mr-28{margin-right:-112px}.lg\:sl--mb-28{margin-bottom:-112px}.lg\:sl--ml-28{margin-left:-112px}.lg\:sl--mt-32{margin-top:-128px}.lg\:sl--mr-32{margin-right:-128px}.lg\:sl--mb-32{margin-bottom:-128px}.lg\:sl--ml-32{margin-left:-128px}.lg\:sl--mt-36{margin-top:-144px}.lg\:sl--mr-36{margin-right:-144px}.lg\:sl--mb-36{margin-bottom:-144px}.lg\:sl--ml-36{margin-left:-144px}.lg\:sl--mt-40{margin-top:-160px}.lg\:sl--mr-40{margin-right:-160px}.lg\:sl--mb-40{margin-bottom:-160px}.lg\:sl--ml-40{margin-left:-160px}.lg\:sl--mt-44{margin-top:-176px}.lg\:sl--mr-44{margin-right:-176px}.lg\:sl--mb-44{margin-bottom:-176px}.lg\:sl--ml-44{margin-left:-176px}.lg\:sl--mt-48{margin-top:-192px}.lg\:sl--mr-48{margin-right:-192px}.lg\:sl--mb-48{margin-bottom:-192px}.lg\:sl--ml-48{margin-left:-192px}.lg\:sl--mt-52{margin-top:-208px}.lg\:sl--mr-52{margin-right:-208px}.lg\:sl--mb-52{margin-bottom:-208px}.lg\:sl--ml-52{margin-left:-208px}.lg\:sl--mt-56{margin-top:-224px}.lg\:sl--mr-56{margin-right:-224px}.lg\:sl--mb-56{margin-bottom:-224px}.lg\:sl--ml-56{margin-left:-224px}.lg\:sl--mt-60{margin-top:-240px}.lg\:sl--mr-60{margin-right:-240px}.lg\:sl--mb-60{margin-bottom:-240px}.lg\:sl--ml-60{margin-left:-240px}.lg\:sl--mt-64{margin-top:-256px}.lg\:sl--mr-64{margin-right:-256px}.lg\:sl--mb-64{margin-bottom:-256px}.lg\:sl--ml-64{margin-left:-256px}.lg\:sl--mt-72{margin-top:-288px}.lg\:sl--mr-72{margin-right:-288px}.lg\:sl--mb-72{margin-bottom:-288px}.lg\:sl--ml-72{margin-left:-288px}.lg\:sl--mt-80{margin-top:-320px}.lg\:sl--mr-80{margin-right:-320px}.lg\:sl--mb-80{margin-bottom:-320px}.lg\:sl--ml-80{margin-left:-320px}.lg\:sl--mt-96{margin-top:-384px}.lg\:sl--mr-96{margin-right:-384px}.lg\:sl--mb-96{margin-bottom:-384px}.lg\:sl--ml-96{margin-left:-384px}.lg\:sl--mt-px{margin-top:-1px}.lg\:sl--mr-px{margin-right:-1px}.lg\:sl--mb-px{margin-bottom:-1px}.lg\:sl--ml-px{margin-left:-1px}.lg\:sl--mt-0\.5{margin-top:-2px}.lg\:sl--mr-0\.5{margin-right:-2px}.lg\:sl--mb-0\.5{margin-bottom:-2px}.lg\:sl--ml-0\.5{margin-left:-2px}.lg\:sl--mt-1\.5{margin-top:-6px}.lg\:sl--mr-1\.5{margin-right:-6px}.lg\:sl--mb-1\.5{margin-bottom:-6px}.lg\:sl--ml-1\.5{margin-left:-6px}.lg\:sl--mt-2\.5{margin-top:-10px}.lg\:sl--mr-2\.5{margin-right:-10px}.lg\:sl--mb-2\.5{margin-bottom:-10px}.lg\:sl--ml-2\.5{margin-left:-10px}.lg\:sl--mt-3\.5{margin-top:-14px}.lg\:sl--mr-3\.5{margin-right:-14px}.lg\:sl--mb-3\.5{margin-bottom:-14px}.lg\:sl--ml-3\.5{margin-left:-14px}.lg\:sl--mt-4\.5{margin-top:-18px}.lg\:sl--mr-4\.5{margin-right:-18px}.lg\:sl--mb-4\.5{margin-bottom:-18px}.lg\:sl--ml-4\.5{margin-left:-18px}.lg\:sl-max-h-full{max-height:100%}.lg\:sl-max-h-screen{max-height:100vh}.lg\:sl-max-w-none{max-width:none}.lg\:sl-max-w-full{ma
x-width:100%}.lg\:sl-max-w-min{max-width:-moz-min-content;max-width:min-content}.lg\:sl-max-w-max{max-width:-moz-max-content;max-width:max-content}.lg\:sl-max-w-prose{max-width:65ch}.lg\:sl-min-h-full{min-height:100%}.lg\:sl-min-h-screen{min-height:100vh}.lg\:sl-min-w-full{min-width:100%}.lg\:sl-min-w-min{min-width:-moz-min-content;min-width:min-content}.lg\:sl-min-w-max{min-width:-moz-max-content;min-width:max-content}.lg\:sl-p-0{padding:0}.lg\:sl-p-1{padding:4px}.lg\:sl-p-2{padding:8px}.lg\:sl-p-3{padding:12px}.lg\:sl-p-4{padding:16px}.lg\:sl-p-5{padding:20px}.lg\:sl-p-6{padding:24px}.lg\:sl-p-7{padding:28px}.lg\:sl-p-8{padding:32px}.lg\:sl-p-9{padding:36px}.lg\:sl-p-10{padding:40px}.lg\:sl-p-11{padding:44px}.lg\:sl-p-12{padding:48px}.lg\:sl-p-14{padding:56px}.lg\:sl-p-16{padding:64px}.lg\:sl-p-20{padding:80px}.lg\:sl-p-24{padding:96px}.lg\:sl-p-28{padding:112px}.lg\:sl-p-32{padding:128px}.lg\:sl-p-36{padding:144px}.lg\:sl-p-40{padding:160px}.lg\:sl-p-44{padding:176px}.lg\:sl-p-48{padding:192px}.lg\:sl-p-52{padding:208px}.lg\:sl-p-56{padding:224px}.lg\:sl-p-60{padding:240px}.lg\:sl-p-64{padding:256px}.lg\:sl-p-72{padding:288px}.lg\:sl-p-80{padding:320px}.lg\:sl-p-96{padding:384px}.lg\:sl-p-px{padding:1px}.lg\:sl-p-0\.5{padding:2px}.lg\:sl-p-1\.5{padding:6px}.lg\:sl-p-2\.5{padding:10px}.lg\:sl-p-3\.5{padding:14px}.lg\:sl-p-4\.5{padding:18px}.lg\:sl-py-0{padding-bottom:0;padding-top:0}.lg\:sl-px-0{padding-left:0;padding-right:0}.lg\:sl-py-1{padding-bottom:4px;padding-top:4px}.lg\:sl-px-1{padding-left:4px;padding-right:4px}.lg\:sl-py-2{padding-bottom:8px;padding-top:8px}.lg\:sl-px-2{padding-left:8px;padding-right:8px}.lg\:sl-py-3{padding-bottom:12px;padding-top:12px}.lg\:sl-px-3{padding-left:12px;padding-right:12px}.lg\:sl-py-4{padding-bottom:16px;padding-top:16px}.lg\:sl-px-4{padding-left:16px;padding-right:16px}.lg\:sl-py-5{padding-bottom:20px;padding-top:20px}.lg\:sl-px-5{padding-left:20px;padding-right:20px}.lg\:sl-py-6{padding-bottom:24px;padding-top:24px}.lg\:sl-px-6{padding-left:24px;padding-right:24px}.lg\:sl-py-7{padding-bottom:28px;padding-top:28px}.lg\:sl-px-7{padding-left:28px;padding-right:28px}.lg\:sl-py-8{padding-bottom:32px;padding-top:32px}.lg\:sl-px-8{padding-left:32px;padding-right:32px}.lg\:sl-py-9{padding-bottom:36px;padding-top:36px}.lg\:sl-px-9{padding-left:36px;padding-right:36px}.lg\:sl-py-10{padding-bottom:40px;padding-top:40px}.lg\:sl-px-10{padding-left:40px;padding-right:40px}.lg\:sl-py-11{padding-bottom:44px;padding-top:44px}.lg\:sl-px-11{padding-left:44px;padding-right:44px}.lg\:sl-py-12{padding-bottom:48px;padding-top:48px}.lg\:sl-px-12{padding-left:48px;padding-right:48px}.lg\:sl-py-14{padding-bottom:56px;padding-top:56px}.lg\:sl-px-14{padding-left:56px;padding-right:56px}.lg\:sl-py-16{padding-bottom:64px;padding-top:64px}.lg\:sl-px-16{padding-left:64px;padding-right:64px}.lg\:sl-py-20{padding-bottom:80px;padding-top:80px}.lg\:sl-px-20{padding-left:80px;padding-right:80px}.lg\:sl-py-24{padding-bottom:96px;padding-top:96px}.lg\:sl-px-24{padding-left:96px;padding-right:96px}.lg\:sl-py-28{padding-bottom:112px;padding-top:112px}.lg\:sl-px-28{padding-left:112px;padding-right:112px}.lg\:sl-py-32{padding-bottom:128px;padding-top:128px}.lg\:sl-px-32{padding-left:128px;padding-right:128px}.lg\:sl-py-36{padding-bottom:144px;padding-top:144px}.lg\:sl-px-36{padding-left:144px;padding-right:144px}.lg\:sl-py-40{padding-bottom:160px;padding-top:160px}.lg\:sl-px-40{padding-left:160px;padding-right:160px}.lg\:sl-py-44{padding-bottom:176px;padding-top:176px}.lg\:sl-px-44{paddin
g-left:176px;padding-right:176px}.lg\:sl-py-48{padding-bottom:192px;padding-top:192px}.lg\:sl-px-48{padding-left:192px;padding-right:192px}.lg\:sl-py-52{padding-bottom:208px;padding-top:208px}.lg\:sl-px-52{padding-left:208px;padding-right:208px}.lg\:sl-py-56{padding-bottom:224px;padding-top:224px}.lg\:sl-px-56{padding-left:224px;padding-right:224px}.lg\:sl-py-60{padding-bottom:240px;padding-top:240px}.lg\:sl-px-60{padding-left:240px;padding-right:240px}.lg\:sl-py-64{padding-bottom:256px;padding-top:256px}.lg\:sl-px-64{padding-left:256px;padding-right:256px}.lg\:sl-py-72{padding-bottom:288px;padding-top:288px}.lg\:sl-px-72{padding-left:288px;padding-right:288px}.lg\:sl-py-80{padding-bottom:320px;padding-top:320px}.lg\:sl-px-80{padding-left:320px;padding-right:320px}.lg\:sl-py-96{padding-bottom:384px;padding-top:384px}.lg\:sl-px-96{padding-left:384px;padding-right:384px}.lg\:sl-py-px{padding-bottom:1px;padding-top:1px}.lg\:sl-px-px{padding-left:1px;padding-right:1px}.lg\:sl-py-0\.5{padding-bottom:2px;padding-top:2px}.lg\:sl-px-0\.5{padding-left:2px;padding-right:2px}.lg\:sl-py-1\.5{padding-bottom:6px;padding-top:6px}.lg\:sl-px-1\.5{padding-left:6px;padding-right:6px}.lg\:sl-py-2\.5{padding-bottom:10px;padding-top:10px}.lg\:sl-px-2\.5{padding-left:10px;padding-right:10px}.lg\:sl-py-3\.5{padding-bottom:14px;padding-top:14px}.lg\:sl-px-3\.5{padding-left:14px;padding-right:14px}.lg\:sl-py-4\.5{padding-bottom:18px;padding-top:18px}.lg\:sl-px-4\.5{padding-left:18px;padding-right:18px}.lg\:sl-pt-0{padding-top:0}.lg\:sl-pr-0{padding-right:0}.lg\:sl-pb-0{padding-bottom:0}.lg\:sl-pl-0{padding-left:0}.lg\:sl-pt-1{padding-top:4px}.lg\:sl-pr-1{padding-right:4px}.lg\:sl-pb-1{padding-bottom:4px}.lg\:sl-pl-1{padding-left:4px}.lg\:sl-pt-2{padding-top:8px}.lg\:sl-pr-2{padding-right:8px}.lg\:sl-pb-2{padding-bottom:8px}.lg\:sl-pl-2{padding-left:8px}.lg\:sl-pt-3{padding-top:12px}.lg\:sl-pr-3{padding-right:12px}.lg\:sl-pb-3{padding-bottom:12px}.lg\:sl-pl-3{padding-left:12px}.lg\:sl-pt-4{padding-top:16px}.lg\:sl-pr-4{padding-right:16px}.lg\:sl-pb-4{padding-bottom:16px}.lg\:sl-pl-4{padding-left:16px}.lg\:sl-pt-5{padding-top:20px}.lg\:sl-pr-5{padding-right:20px}.lg\:sl-pb-5{padding-bottom:20px}.lg\:sl-pl-5{padding-left:20px}.lg\:sl-pt-6{padding-top:24px}.lg\:sl-pr-6{padding-right:24px}.lg\:sl-pb-6{padding-bottom:24px}.lg\:sl-pl-6{padding-left:24px}.lg\:sl-pt-7{padding-top:28px}.lg\:sl-pr-7{padding-right:28px}.lg\:sl-pb-7{padding-bottom:28px}.lg\:sl-pl-7{padding-left:28px}.lg\:sl-pt-8{padding-top:32px}.lg\:sl-pr-8{padding-right:32px}.lg\:sl-pb-8{padding-bottom:32px}.lg\:sl-pl-8{padding-left:32px}.lg\:sl-pt-9{padding-top:36px}.lg\:sl-pr-9{padding-right:36px}.lg\:sl-pb-9{padding-bottom:36px}.lg\:sl-pl-9{padding-left:36px}.lg\:sl-pt-10{padding-top:40px}.lg\:sl-pr-10{padding-right:40px}.lg\:sl-pb-10{padding-bottom:40px}.lg\:sl-pl-10{padding-left:40px}.lg\:sl-pt-11{padding-top:44px}.lg\:sl-pr-11{padding-right:44px}.lg\:sl-pb-11{padding-bottom:44px}.lg\:sl-pl-11{padding-left:44px}.lg\:sl-pt-12{padding-top:48px}.lg\:sl-pr-12{padding-right:48px}.lg\:sl-pb-12{padding-bottom:48px}.lg\:sl-pl-12{padding-left:48px}.lg\:sl-pt-14{padding-top:56px}.lg\:sl-pr-14{padding-right:56px}.lg\:sl-pb-14{padding-bottom:56px}.lg\:sl-pl-14{padding-left:56px}.lg\:sl-pt-16{padding-top:64px}.lg\:sl-pr-16{padding-right:64px}.lg\:sl-pb-16{padding-bottom:64px}.lg\:sl-pl-16{padding-left:64px}.lg\:sl-pt-20{padding-top:80px}.lg\:sl-pr-20{padding-right:80px}.lg\:sl-pb-20{padding-bottom:80px}.lg\:sl-pl-20{padding-left:80px}.lg\:sl-pt-24{padding-top:96px}.lg
\:sl-pr-24{padding-right:96px}.lg\:sl-pb-24{padding-bottom:96px}.lg\:sl-pl-24{padding-left:96px}.lg\:sl-pt-28{padding-top:112px}.lg\:sl-pr-28{padding-right:112px}.lg\:sl-pb-28{padding-bottom:112px}.lg\:sl-pl-28{padding-left:112px}.lg\:sl-pt-32{padding-top:128px}.lg\:sl-pr-32{padding-right:128px}.lg\:sl-pb-32{padding-bottom:128px}.lg\:sl-pl-32{padding-left:128px}.lg\:sl-pt-36{padding-top:144px}.lg\:sl-pr-36{padding-right:144px}.lg\:sl-pb-36{padding-bottom:144px}.lg\:sl-pl-36{padding-left:144px}.lg\:sl-pt-40{padding-top:160px}.lg\:sl-pr-40{padding-right:160px}.lg\:sl-pb-40{padding-bottom:160px}.lg\:sl-pl-40{padding-left:160px}.lg\:sl-pt-44{padding-top:176px}.lg\:sl-pr-44{padding-right:176px}.lg\:sl-pb-44{padding-bottom:176px}.lg\:sl-pl-44{padding-left:176px}.lg\:sl-pt-48{padding-top:192px}.lg\:sl-pr-48{padding-right:192px}.lg\:sl-pb-48{padding-bottom:192px}.lg\:sl-pl-48{padding-left:192px}.lg\:sl-pt-52{padding-top:208px}.lg\:sl-pr-52{padding-right:208px}.lg\:sl-pb-52{padding-bottom:208px}.lg\:sl-pl-52{padding-left:208px}.lg\:sl-pt-56{padding-top:224px}.lg\:sl-pr-56{padding-right:224px}.lg\:sl-pb-56{padding-bottom:224px}.lg\:sl-pl-56{padding-left:224px}.lg\:sl-pt-60{padding-top:240px}.lg\:sl-pr-60{padding-right:240px}.lg\:sl-pb-60{padding-bottom:240px}.lg\:sl-pl-60{padding-left:240px}.lg\:sl-pt-64{padding-top:256px}.lg\:sl-pr-64{padding-right:256px}.lg\:sl-pb-64{padding-bottom:256px}.lg\:sl-pl-64{padding-left:256px}.lg\:sl-pt-72{padding-top:288px}.lg\:sl-pr-72{padding-right:288px}.lg\:sl-pb-72{padding-bottom:288px}.lg\:sl-pl-72{padding-left:288px}.lg\:sl-pt-80{padding-top:320px}.lg\:sl-pr-80{padding-right:320px}.lg\:sl-pb-80{padding-bottom:320px}.lg\:sl-pl-80{padding-left:320px}.lg\:sl-pt-96{padding-top:384px}.lg\:sl-pr-96{padding-right:384px}.lg\:sl-pb-96{padding-bottom:384px}.lg\:sl-pl-96{padding-left:384px}.lg\:sl-pt-px{padding-top:1px}.lg\:sl-pr-px{padding-right:1px}.lg\:sl-pb-px{padding-bottom:1px}.lg\:sl-pl-px{padding-left:1px}.lg\:sl-pt-0\.5{padding-top:2px}.lg\:sl-pr-0\.5{padding-right:2px}.lg\:sl-pb-0\.5{padding-bottom:2px}.lg\:sl-pl-0\.5{padding-left:2px}.lg\:sl-pt-1\.5{padding-top:6px}.lg\:sl-pr-1\.5{padding-right:6px}.lg\:sl-pb-1\.5{padding-bottom:6px}.lg\:sl-pl-1\.5{padding-left:6px}.lg\:sl-pt-2\.5{padding-top:10px}.lg\:sl-pr-2\.5{padding-right:10px}.lg\:sl-pb-2\.5{padding-bottom:10px}.lg\:sl-pl-2\.5{padding-left:10px}.lg\:sl-pt-3\.5{padding-top:14px}.lg\:sl-pr-3\.5{padding-right:14px}.lg\:sl-pb-3\.5{padding-bottom:14px}.lg\:sl-pl-3\.5{padding-left:14px}.lg\:sl-pt-4\.5{padding-top:18px}.lg\:sl-pr-4\.5{padding-right:18px}.lg\:sl-pb-4\.5{padding-bottom:18px}.lg\:sl-pl-4\.5{padding-left:18px}.lg\:sl-static{position:static}.lg\:sl-fixed{position:fixed}.lg\:sl-absolute{position:absolute}.lg\:sl-relative{position:relative}.lg\:sl-sticky{position:-webkit-sticky;position:sticky}.lg\:sl-visible{visibility:visible}.lg\:sl-invisible{visibility:hidden}.sl-group:hover .lg\:group-hover\:sl-visible{visibility:visible}.sl-group:hover .lg\:group-hover\:sl-invisible{visibility:hidden}.sl-group:focus .lg\:group-focus\:sl-visible{visibility:visible}.sl-group:focus 
.lg\:group-focus\:sl-invisible{visibility:hidden}.lg\:sl-w-0{width:0}.lg\:sl-w-1{width:4px}.lg\:sl-w-2{width:8px}.lg\:sl-w-3{width:12px}.lg\:sl-w-4{width:16px}.lg\:sl-w-5{width:20px}.lg\:sl-w-6{width:24px}.lg\:sl-w-7{width:28px}.lg\:sl-w-8{width:32px}.lg\:sl-w-9{width:36px}.lg\:sl-w-10{width:40px}.lg\:sl-w-11{width:44px}.lg\:sl-w-12{width:48px}.lg\:sl-w-14{width:56px}.lg\:sl-w-16{width:64px}.lg\:sl-w-20{width:80px}.lg\:sl-w-24{width:96px}.lg\:sl-w-28{width:112px}.lg\:sl-w-32{width:128px}.lg\:sl-w-36{width:144px}.lg\:sl-w-40{width:160px}.lg\:sl-w-44{width:176px}.lg\:sl-w-48{width:192px}.lg\:sl-w-52{width:208px}.lg\:sl-w-56{width:224px}.lg\:sl-w-60{width:240px}.lg\:sl-w-64{width:256px}.lg\:sl-w-72{width:288px}.lg\:sl-w-80{width:320px}.lg\:sl-w-96{width:384px}.lg\:sl-w-auto{width:auto}.lg\:sl-w-px{width:1px}.lg\:sl-w-0\.5{width:2px}.lg\:sl-w-1\.5{width:6px}.lg\:sl-w-2\.5{width:10px}.lg\:sl-w-3\.5{width:14px}.lg\:sl-w-4\.5{width:18px}.lg\:sl-w-xs{width:20px}.lg\:sl-w-sm{width:24px}.lg\:sl-w-md{width:32px}.lg\:sl-w-lg{width:36px}.lg\:sl-w-xl{width:44px}.lg\:sl-w-2xl{width:52px}.lg\:sl-w-3xl{width:60px}.lg\:sl-w-1\/2{width:50%}.lg\:sl-w-1\/3{width:33.333333%}.lg\:sl-w-2\/3{width:66.666667%}.lg\:sl-w-1\/4{width:25%}.lg\:sl-w-2\/4{width:50%}.lg\:sl-w-3\/4{width:75%}.lg\:sl-w-1\/5{width:20%}.lg\:sl-w-2\/5{width:40%}.lg\:sl-w-3\/5{width:60%}.lg\:sl-w-4\/5{width:80%}.lg\:sl-w-1\/6{width:16.666667%}.lg\:sl-w-2\/6{width:33.333333%}.lg\:sl-w-3\/6{width:50%}.lg\:sl-w-4\/6{width:66.666667%}.lg\:sl-w-5\/6{width:83.333333%}.lg\:sl-w-full{width:100%}.lg\:sl-w-screen{width:100vw}.lg\:sl-w-min{width:-moz-min-content;width:min-content}.lg\:sl-w-max{width:-moz-max-content;width:max-content}}@media (max-width:1399px){.sl-stack--horizontal.xl\:sl-stack--1>:not(style)~:not(style){margin-left:4px}.sl-stack--vertical.xl\:sl-stack--1>:not(style)~:not(style){margin-top:4px}.sl-stack--horizontal.xl\:sl-stack--2>:not(style)~:not(style){margin-left:8px}.sl-stack--vertical.xl\:sl-stack--2>:not(style)~:not(style){margin-top:8px}.sl-stack--horizontal.xl\:sl-stack--3>:not(style)~:not(style){margin-left:12px}.sl-stack--vertical.xl\:sl-stack--3>:not(style)~:not(style){margin-top:12px}.sl-stack--horizontal.xl\:sl-stack--4>:not(style)~:not(style){margin-left:16px}.sl-stack--vertical.xl\:sl-stack--4>:not(style)~:not(style){margin-top:16px}.sl-stack--horizontal.xl\:sl-stack--5>:not(style)~:not(style){margin-left:20px}.sl-stack--vertical.xl\:sl-stack--5>:not(style)~:not(style){margin-top:20px}.sl-stack--horizontal.xl\:sl-stack--6>:not(style)~:not(style){margin-left:24px}.sl-stack--vertical.xl\:sl-stack--6>:not(style)~:not(style){margin-top:24px}.sl-stack--horizontal.xl\:sl-stack--7>:not(style)~:not(style){margin-left:28px}.sl-stack--vertical.xl\:sl-stack--7>:not(style)~:not(style){margin-top:28px}.sl-stack--horizontal.xl\:sl-stack--8>:not(style)~:not(style){margin-left:32px}.sl-stack--vertical.xl\:sl-stack--8>:not(style)~:not(style){margin-top:32px}.sl-stack--horizontal.xl\:sl-stack--9>:not(style)~:not(style){margin-left:36px}.sl-stack--vertical.xl\:sl-stack--9>:not(style)~:not(style){margin-top:36px}.sl-stack--horizontal.xl\:sl-stack--10>:not(style)~:not(style){margin-left:40px}.sl-stack--vertical.xl\:sl-stack--10>:not(style)~:not(style){margin-top:40px}.sl-stack--horizontal.xl\:sl-stack--12>:not(style)~:not(style){margin-left:48px}.sl-stack--vertical.xl\:sl-stack--12>:not(style)~:not(style){margin-top:48px}.sl-stack--horizontal.xl\:sl-stack--14>:not(style)~:not(style){margin-left:56px}.sl-stack--vertical.xl\:sl-stack--14>:not(style
)~:not(style){margin-top:56px}.sl-stack--horizontal.xl\:sl-stack--16>:not(style)~:not(style){margin-left:64px}.sl-stack--vertical.xl\:sl-stack--16>:not(style)~:not(style){margin-top:64px}.sl-stack--horizontal.xl\:sl-stack--20>:not(style)~:not(style){margin-left:80px}.sl-stack--vertical.xl\:sl-stack--20>:not(style)~:not(style){margin-top:80px}.sl-stack--horizontal.xl\:sl-stack--24>:not(style)~:not(style){margin-left:96px}.sl-stack--vertical.xl\:sl-stack--24>:not(style)~:not(style){margin-top:96px}.sl-stack--horizontal.xl\:sl-stack--32>:not(style)~:not(style){margin-left:128px}.sl-stack--vertical.xl\:sl-stack--32>:not(style)~:not(style){margin-top:128px}.xl\:sl-content-center{align-content:center}.xl\:sl-content-start{align-content:flex-start}.xl\:sl-content-end{align-content:flex-end}.xl\:sl-content-between{align-content:space-between}.xl\:sl-content-around{align-content:space-around}.xl\:sl-content-evenly{align-content:space-evenly}.xl\:sl-items-start{align-items:flex-start}.xl\:sl-items-end{align-items:flex-end}.xl\:sl-items-center{align-items:center}.xl\:sl-items-baseline{align-items:baseline}.xl\:sl-items-stretch{align-items:stretch}.xl\:sl-self-auto{align-self:auto}.xl\:sl-self-start{align-self:flex-start}.xl\:sl-self-end{align-self:flex-end}.xl\:sl-self-center{align-self:center}.xl\:sl-self-stretch{align-self:stretch}.xl\:sl-blur-0,.xl\:sl-blur-none{--tw-blur:blur(0)}.xl\:sl-blur-sm{--tw-blur:blur(4px)}.xl\:sl-blur{--tw-blur:blur(8px)}.xl\:sl-blur-md{--tw-blur:blur(12px)}.xl\:sl-blur-lg{--tw-blur:blur(16px)}.xl\:sl-blur-xl{--tw-blur:blur(24px)}.xl\:sl-blur-2xl{--tw-blur:blur(40px)}.xl\:sl-blur-3xl{--tw-blur:blur(64px)}.xl\:sl-block{display:block}.xl\:sl-inline-block{display:inline-block}.xl\:sl-inline{display:inline}.xl\:sl-flex{display:flex}.xl\:sl-inline-flex{display:inline-flex}.xl\:sl-table{display:table}.xl\:sl-inline-table{display:inline-table}.xl\:sl-table-caption{display:table-caption}.xl\:sl-table-cell{display:table-cell}.xl\:sl-table-column{display:table-column}.xl\:sl-table-column-group{display:table-column-group}.xl\:sl-table-footer-group{display:table-footer-group}.xl\:sl-table-header-group{display:table-header-group}.xl\:sl-table-row-group{display:table-row-group}.xl\:sl-table-row{display:table-row}.xl\:sl-flow-root{display:flow-root}.xl\:sl-grid{display:grid}.xl\:sl-inline-grid{display:inline-grid}.xl\:sl-contents{display:contents}.xl\:sl-list-item{display:list-item}.xl\:sl-hidden{display:none}.xl\:sl-drop-shadow{--tw-drop-shadow:drop-shadow(var(--drop-shadow-default1)) drop-shadow(var(--drop-shadow-default2))}.xl\:sl-flex-1{flex:1 1}.xl\:sl-flex-auto{flex:1 1 auto}.xl\:sl-flex-initial{flex:0 1 
auto}.xl\:sl-flex-none{flex:none}.xl\:sl-flex-row{flex-direction:row}.xl\:sl-flex-row-reverse{flex-direction:row-reverse}.xl\:sl-flex-col{flex-direction:column}.xl\:sl-flex-col-reverse{flex-direction:column-reverse}.xl\:sl-flex-grow-0{flex-grow:0}.xl\:sl-flex-grow{flex-grow:1}.xl\:sl-flex-shrink-0{flex-shrink:0}.xl\:sl-flex-shrink{flex-shrink:1}.xl\:sl-flex-wrap{flex-wrap:wrap}.xl\:sl-flex-wrap-reverse{flex-wrap:wrap-reverse}.xl\:sl-flex-nowrap{flex-wrap:nowrap}.xl\:sl-h-0{height:0}.xl\:sl-h-1{height:4px}.xl\:sl-h-2{height:8px}.xl\:sl-h-3{height:12px}.xl\:sl-h-4{height:16px}.xl\:sl-h-5{height:20px}.xl\:sl-h-6{height:24px}.xl\:sl-h-7{height:28px}.xl\:sl-h-8{height:32px}.xl\:sl-h-9{height:36px}.xl\:sl-h-10{height:40px}.xl\:sl-h-11{height:44px}.xl\:sl-h-12{height:48px}.xl\:sl-h-14{height:56px}.xl\:sl-h-16{height:64px}.xl\:sl-h-20{height:80px}.xl\:sl-h-24{height:96px}.xl\:sl-h-28{height:112px}.xl\:sl-h-32{height:128px}.xl\:sl-h-36{height:144px}.xl\:sl-h-40{height:160px}.xl\:sl-h-44{height:176px}.xl\:sl-h-48{height:192px}.xl\:sl-h-52{height:208px}.xl\:sl-h-56{height:224px}.xl\:sl-h-60{height:240px}.xl\:sl-h-64{height:256px}.xl\:sl-h-72{height:288px}.xl\:sl-h-80{height:320px}.xl\:sl-h-96{height:384px}.xl\:sl-h-auto{height:auto}.xl\:sl-h-px{height:1px}.xl\:sl-h-0\.5{height:2px}.xl\:sl-h-1\.5{height:6px}.xl\:sl-h-2\.5{height:10px}.xl\:sl-h-3\.5{height:14px}.xl\:sl-h-4\.5{height:18px}.xl\:sl-h-xs{height:20px}.xl\:sl-h-sm{height:24px}.xl\:sl-h-md{height:32px}.xl\:sl-h-lg{height:36px}.xl\:sl-h-xl{height:44px}.xl\:sl-h-2xl{height:52px}.xl\:sl-h-3xl{height:60px}.xl\:sl-h-full{height:100%}.xl\:sl-h-screen{height:100vh}.xl\:sl-justify-start{justify-content:flex-start}.xl\:sl-justify-end{justify-content:flex-end}.xl\:sl-justify-center{justify-content:center}.xl\:sl-justify-between{justify-content:space-between}.xl\:sl-justify-around{justify-content:space-around}.xl\:sl-justify-evenly{justify-content:space-evenly}.xl\:sl-justify-items-start{justify-items:start}.xl\:sl-justify-items-end{justify-items:end}.xl\:sl-justify-items-center{justify-items:center}.xl\:sl-justify-items-stretch{justify-items:stretch}.xl\:sl-justify-self-auto{justify-self:auto}.xl\:sl-justify-self-start{justify-self:start}.xl\:sl-justify-self-end{justify-self:end}.xl\:sl-justify-self-center{justify-self:center}.xl\:sl-justify-self-stretch{justify-self:stretch}.xl\:sl-m-0{margin:0}.xl\:sl-m-1{margin:4px}.xl\:sl-m-2{margin:8px}.xl\:sl-m-3{margin:12px}.xl\:sl-m-4{margin:16px}.xl\:sl-m-5{margin:20px}.xl\:sl-m-6{margin:24px}.xl\:sl-m-7{margin:28px}.xl\:sl-m-8{margin:32px}.xl\:sl-m-9{margin:36px}.xl\:sl-m-10{margin:40px}.xl\:sl-m-11{margin:44px}.xl\:sl-m-12{margin:48px}.xl\:sl-m-14{margin:56px}.xl\:sl-m-16{margin:64px}.xl\:sl-m-20{margin:80px}.xl\:sl-m-24{margin:96px}.xl\:sl-m-28{margin:112px}.xl\:sl-m-32{margin:128px}.xl\:sl-m-36{margin:144px}.xl\:sl-m-40{margin:160px}.xl\:sl-m-44{margin:176px}.xl\:sl-m-48{margin:192px}.xl\:sl-m-52{margin:208px}.xl\:sl-m-56{margin:224px}.xl\:sl-m-60{margin:240px}.xl\:sl-m-64{margin:256px}.xl\:sl-m-72{margin:288px}.xl\:sl-m-80{margin:320px}.xl\:sl-m-96{margin:384px}.xl\:sl-m-auto{margin:auto}.xl\:sl-m-px{margin:1px}.xl\:sl-m-0\.5{margin:2px}.xl\:sl-m-1\.5{margin:6px}.xl\:sl-m-2\.5{margin:10px}.xl\:sl-m-3\.5{margin:14px}.xl\:sl-m-4\.5{margin:18px}.xl\:sl--m-0{margin:0}.xl\:sl--m-1{margin:-4px}.xl\:sl--m-2{margin:-8px}.xl\:sl--m-3{margin:-12px}.xl\:sl--m-4{margin:-16px}.xl\:sl--m-5{margin:-20px}.xl\:sl--m-6{margin:-24px}.xl\:sl--m-7{margin:-28px}.xl\:sl--m-8{margin:-32px}.xl\:sl--m-9{margin:-36px}.xl\:sl--m-10{m
argin:-40px}.xl\:sl--m-11{margin:-44px}.xl\:sl--m-12{margin:-48px}.xl\:sl--m-14{margin:-56px}.xl\:sl--m-16{margin:-64px}.xl\:sl--m-20{margin:-80px}.xl\:sl--m-24{margin:-96px}.xl\:sl--m-28{margin:-112px}.xl\:sl--m-32{margin:-128px}.xl\:sl--m-36{margin:-144px}.xl\:sl--m-40{margin:-160px}.xl\:sl--m-44{margin:-176px}.xl\:sl--m-48{margin:-192px}.xl\:sl--m-52{margin:-208px}.xl\:sl--m-56{margin:-224px}.xl\:sl--m-60{margin:-240px}.xl\:sl--m-64{margin:-256px}.xl\:sl--m-72{margin:-288px}.xl\:sl--m-80{margin:-320px}.xl\:sl--m-96{margin:-384px}.xl\:sl--m-px{margin:-1px}.xl\:sl--m-0\.5{margin:-2px}.xl\:sl--m-1\.5{margin:-6px}.xl\:sl--m-2\.5{margin:-10px}.xl\:sl--m-3\.5{margin:-14px}.xl\:sl--m-4\.5{margin:-18px}.xl\:sl-my-0{margin-bottom:0;margin-top:0}.xl\:sl-mx-0{margin-left:0;margin-right:0}.xl\:sl-my-1{margin-bottom:4px;margin-top:4px}.xl\:sl-mx-1{margin-left:4px;margin-right:4px}.xl\:sl-my-2{margin-bottom:8px;margin-top:8px}.xl\:sl-mx-2{margin-left:8px;margin-right:8px}.xl\:sl-my-3{margin-bottom:12px;margin-top:12px}.xl\:sl-mx-3{margin-left:12px;margin-right:12px}.xl\:sl-my-4{margin-bottom:16px;margin-top:16px}.xl\:sl-mx-4{margin-left:16px;margin-right:16px}.xl\:sl-my-5{margin-bottom:20px;margin-top:20px}.xl\:sl-mx-5{margin-left:20px;margin-right:20px}.xl\:sl-my-6{margin-bottom:24px;margin-top:24px}.xl\:sl-mx-6{margin-left:24px;margin-right:24px}.xl\:sl-my-7{margin-bottom:28px;margin-top:28px}.xl\:sl-mx-7{margin-left:28px;margin-right:28px}.xl\:sl-my-8{margin-bottom:32px;margin-top:32px}.xl\:sl-mx-8{margin-left:32px;margin-right:32px}.xl\:sl-my-9{margin-bottom:36px;margin-top:36px}.xl\:sl-mx-9{margin-left:36px;margin-right:36px}.xl\:sl-my-10{margin-bottom:40px;margin-top:40px}.xl\:sl-mx-10{margin-left:40px;margin-right:40px}.xl\:sl-my-11{margin-bottom:44px;margin-top:44px}.xl\:sl-mx-11{margin-left:44px;margin-right:44px}.xl\:sl-my-12{margin-bottom:48px;margin-top:48px}.xl\:sl-mx-12{margin-left:48px;margin-right:48px}.xl\:sl-my-14{margin-bottom:56px;margin-top:56px}.xl\:sl-mx-14{margin-left:56px;margin-right:56px}.xl\:sl-my-16{margin-bottom:64px;margin-top:64px}.xl\:sl-mx-16{margin-left:64px;margin-right:64px}.xl\:sl-my-20{margin-bottom:80px;margin-top:80px}.xl\:sl-mx-20{margin-left:80px;margin-right:80px}.xl\:sl-my-24{margin-bottom:96px;margin-top:96px}.xl\:sl-mx-24{margin-left:96px;margin-right:96px}.xl\:sl-my-28{margin-bottom:112px;margin-top:112px}.xl\:sl-mx-28{margin-left:112px;margin-right:112px}.xl\:sl-my-32{margin-bottom:128px;margin-top:128px}.xl\:sl-mx-32{margin-left:128px;margin-right:128px}.xl\:sl-my-36{margin-bottom:144px;margin-top:144px}.xl\:sl-mx-36{margin-left:144px;margin-right:144px}.xl\:sl-my-40{margin-bottom:160px;margin-top:160px}.xl\:sl-mx-40{margin-left:160px;margin-right:160px}.xl\:sl-my-44{margin-bottom:176px;margin-top:176px}.xl\:sl-mx-44{margin-left:176px;margin-right:176px}.xl\:sl-my-48{margin-bottom:192px;margin-top:192px}.xl\:sl-mx-48{margin-left:192px;margin-right:192px}.xl\:sl-my-52{margin-bottom:208px;margin-top:208px}.xl\:sl-mx-52{margin-left:208px;margin-right:208px}.xl\:sl-my-56{margin-bottom:224px;margin-top:224px}.xl\:sl-mx-56{margin-left:224px;margin-right:224px}.xl\:sl-my-60{margin-bottom:240px;margin-top:240px}.xl\:sl-mx-60{margin-left:240px;margin-right:240px}.xl\:sl-my-64{margin-bottom:256px;margin-top:256px}.xl\:sl-mx-64{margin-left:256px;margin-right:256px}.xl\:sl-my-72{margin-bottom:288px;margin-top:288px}.xl\:sl-mx-72{margin-left:288px;margin-right:288px}.xl\:sl-my-80{margin-bottom:320px;margin-top:320px}.xl\:sl-mx-80{margin-left:320px;margin-right:320p
x}.xl\:sl-my-96{margin-bottom:384px;margin-top:384px}.xl\:sl-mx-96{margin-left:384px;margin-right:384px}.xl\:sl-my-auto{margin-bottom:auto;margin-top:auto}.xl\:sl-mx-auto{margin-left:auto;margin-right:auto}.xl\:sl-my-px{margin-bottom:1px;margin-top:1px}.xl\:sl-mx-px{margin-left:1px;margin-right:1px}.xl\:sl-my-0\.5{margin-bottom:2px;margin-top:2px}.xl\:sl-mx-0\.5{margin-left:2px;margin-right:2px}.xl\:sl-my-1\.5{margin-bottom:6px;margin-top:6px}.xl\:sl-mx-1\.5{margin-left:6px;margin-right:6px}.xl\:sl-my-2\.5{margin-bottom:10px;margin-top:10px}.xl\:sl-mx-2\.5{margin-left:10px;margin-right:10px}.xl\:sl-my-3\.5{margin-bottom:14px;margin-top:14px}.xl\:sl-mx-3\.5{margin-left:14px;margin-right:14px}.xl\:sl-my-4\.5{margin-bottom:18px;margin-top:18px}.xl\:sl-mx-4\.5{margin-left:18px;margin-right:18px}.xl\:sl--my-0{margin-bottom:0;margin-top:0}.xl\:sl--mx-0{margin-left:0;margin-right:0}.xl\:sl--my-1{margin-bottom:-4px;margin-top:-4px}.xl\:sl--mx-1{margin-left:-4px;margin-right:-4px}.xl\:sl--my-2{margin-bottom:-8px;margin-top:-8px}.xl\:sl--mx-2{margin-left:-8px;margin-right:-8px}.xl\:sl--my-3{margin-bottom:-12px;margin-top:-12px}.xl\:sl--mx-3{margin-left:-12px;margin-right:-12px}.xl\:sl--my-4{margin-bottom:-16px;margin-top:-16px}.xl\:sl--mx-4{margin-left:-16px;margin-right:-16px}.xl\:sl--my-5{margin-bottom:-20px;margin-top:-20px}.xl\:sl--mx-5{margin-left:-20px;margin-right:-20px}.xl\:sl--my-6{margin-bottom:-24px;margin-top:-24px}.xl\:sl--mx-6{margin-left:-24px;margin-right:-24px}.xl\:sl--my-7{margin-bottom:-28px;margin-top:-28px}.xl\:sl--mx-7{margin-left:-28px;margin-right:-28px}.xl\:sl--my-8{margin-bottom:-32px;margin-top:-32px}.xl\:sl--mx-8{margin-left:-32px;margin-right:-32px}.xl\:sl--my-9{margin-bottom:-36px;margin-top:-36px}.xl\:sl--mx-9{margin-left:-36px;margin-right:-36px}.xl\:sl--my-10{margin-bottom:-40px;margin-top:-40px}.xl\:sl--mx-10{margin-left:-40px;margin-right:-40px}.xl\:sl--my-11{margin-bottom:-44px;margin-top:-44px}.xl\:sl--mx-11{margin-left:-44px;margin-right:-44px}.xl\:sl--my-12{margin-bottom:-48px;margin-top:-48px}.xl\:sl--mx-12{margin-left:-48px;margin-right:-48px}.xl\:sl--my-14{margin-bottom:-56px;margin-top:-56px}.xl\:sl--mx-14{margin-left:-56px;margin-right:-56px}.xl\:sl--my-16{margin-bottom:-64px;margin-top:-64px}.xl\:sl--mx-16{margin-left:-64px;margin-right:-64px}.xl\:sl--my-20{margin-bottom:-80px;margin-top:-80px}.xl\:sl--mx-20{margin-left:-80px;margin-right:-80px}.xl\:sl--my-24{margin-bottom:-96px;margin-top:-96px}.xl\:sl--mx-24{margin-left:-96px;margin-right:-96px}.xl\:sl--my-28{margin-bottom:-112px;margin-top:-112px}.xl\:sl--mx-28{margin-left:-112px;margin-right:-112px}.xl\:sl--my-32{margin-bottom:-128px;margin-top:-128px}.xl\:sl--mx-32{margin-left:-128px;margin-right:-128px}.xl\:sl--my-36{margin-bottom:-144px;margin-top:-144px}.xl\:sl--mx-36{margin-left:-144px;margin-right:-144px}.xl\:sl--my-40{margin-bottom:-160px;margin-top:-160px}.xl\:sl--mx-40{margin-left:-160px;margin-right:-160px}.xl\:sl--my-44{margin-bottom:-176px;margin-top:-176px}.xl\:sl--mx-44{margin-left:-176px;margin-right:-176px}.xl\:sl--my-48{margin-bottom:-192px;margin-top:-192px}.xl\:sl--mx-48{margin-left:-192px;margin-right:-192px}.xl\:sl--my-52{margin-bottom:-208px;margin-top:-208px}.xl\:sl--mx-52{margin-left:-208px;margin-right:-208px}.xl\:sl--my-56{margin-bottom:-224px;margin-top:-224px}.xl\:sl--mx-56{margin-left:-224px;margin-right:-224px}.xl\:sl--my-60{margin-bottom:-240px;margin-top:-240px}.xl\:sl--mx-60{margin-left:-240px;margin-right:-240px}.xl\:sl--my-64{margin-bottom:-256px;margin-top:-256px}.xl
\:sl--mx-64{margin-left:-256px;margin-right:-256px}.xl\:sl--my-72{margin-bottom:-288px;margin-top:-288px}.xl\:sl--mx-72{margin-left:-288px;margin-right:-288px}.xl\:sl--my-80{margin-bottom:-320px;margin-top:-320px}.xl\:sl--mx-80{margin-left:-320px;margin-right:-320px}.xl\:sl--my-96{margin-bottom:-384px;margin-top:-384px}.xl\:sl--mx-96{margin-left:-384px;margin-right:-384px}.xl\:sl--my-px{margin-bottom:-1px;margin-top:-1px}.xl\:sl--mx-px{margin-left:-1px;margin-right:-1px}.xl\:sl--my-0\.5{margin-bottom:-2px;margin-top:-2px}.xl\:sl--mx-0\.5{margin-left:-2px;margin-right:-2px}.xl\:sl--my-1\.5{margin-bottom:-6px;margin-top:-6px}.xl\:sl--mx-1\.5{margin-left:-6px;margin-right:-6px}.xl\:sl--my-2\.5{margin-bottom:-10px;margin-top:-10px}.xl\:sl--mx-2\.5{margin-left:-10px;margin-right:-10px}.xl\:sl--my-3\.5{margin-bottom:-14px;margin-top:-14px}.xl\:sl--mx-3\.5{margin-left:-14px;margin-right:-14px}.xl\:sl--my-4\.5{margin-bottom:-18px;margin-top:-18px}.xl\:sl--mx-4\.5{margin-left:-18px;margin-right:-18px}.xl\:sl-mt-0{margin-top:0}.xl\:sl-mr-0{margin-right:0}.xl\:sl-mb-0{margin-bottom:0}.xl\:sl-ml-0{margin-left:0}.xl\:sl-mt-1{margin-top:4px}.xl\:sl-mr-1{margin-right:4px}.xl\:sl-mb-1{margin-bottom:4px}.xl\:sl-ml-1{margin-left:4px}.xl\:sl-mt-2{margin-top:8px}.xl\:sl-mr-2{margin-right:8px}.xl\:sl-mb-2{margin-bottom:8px}.xl\:sl-ml-2{margin-left:8px}.xl\:sl-mt-3{margin-top:12px}.xl\:sl-mr-3{margin-right:12px}.xl\:sl-mb-3{margin-bottom:12px}.xl\:sl-ml-3{margin-left:12px}.xl\:sl-mt-4{margin-top:16px}.xl\:sl-mr-4{margin-right:16px}.xl\:sl-mb-4{margin-bottom:16px}.xl\:sl-ml-4{margin-left:16px}.xl\:sl-mt-5{margin-top:20px}.xl\:sl-mr-5{margin-right:20px}.xl\:sl-mb-5{margin-bottom:20px}.xl\:sl-ml-5{margin-left:20px}.xl\:sl-mt-6{margin-top:24px}.xl\:sl-mr-6{margin-right:24px}.xl\:sl-mb-6{margin-bottom:24px}.xl\:sl-ml-6{margin-left:24px}.xl\:sl-mt-7{margin-top:28px}.xl\:sl-mr-7{margin-right:28px}.xl\:sl-mb-7{margin-bottom:28px}.xl\:sl-ml-7{margin-left:28px}.xl\:sl-mt-8{margin-top:32px}.xl\:sl-mr-8{margin-right:32px}.xl\:sl-mb-8{margin-bottom:32px}.xl\:sl-ml-8{margin-left:32px}.xl\:sl-mt-9{margin-top:36px}.xl\:sl-mr-9{margin-right:36px}.xl\:sl-mb-9{margin-bottom:36px}.xl\:sl-ml-9{margin-left:36px}.xl\:sl-mt-10{margin-top:40px}.xl\:sl-mr-10{margin-right:40px}.xl\:sl-mb-10{margin-bottom:40px}.xl\:sl-ml-10{margin-left:40px}.xl\:sl-mt-11{margin-top:44px}.xl\:sl-mr-11{margin-right:44px}.xl\:sl-mb-11{margin-bottom:44px}.xl\:sl-ml-11{margin-left:44px}.xl\:sl-mt-12{margin-top:48px}.xl\:sl-mr-12{margin-right:48px}.xl\:sl-mb-12{margin-bottom:48px}.xl\:sl-ml-12{margin-left:48px}.xl\:sl-mt-14{margin-top:56px}.xl\:sl-mr-14{margin-right:56px}.xl\:sl-mb-14{margin-bottom:56px}.xl\:sl-ml-14{margin-left:56px}.xl\:sl-mt-16{margin-top:64px}.xl\:sl-mr-16{margin-right:64px}.xl\:sl-mb-16{margin-bottom:64px}.xl\:sl-ml-16{margin-left:64px}.xl\:sl-mt-20{margin-top:80px}.xl\:sl-mr-20{margin-right:80px}.xl\:sl-mb-20{margin-bottom:80px}.xl\:sl-ml-20{margin-left:80px}.xl\:sl-mt-24{margin-top:96px}.xl\:sl-mr-24{margin-right:96px}.xl\:sl-mb-24{margin-bottom:96px}.xl\:sl-ml-24{margin-left:96px}.xl\:sl-mt-28{margin-top:112px}.xl\:sl-mr-28{margin-right:112px}.xl\:sl-mb-28{margin-bottom:112px}.xl\:sl-ml-28{margin-left:112px}.xl\:sl-mt-32{margin-top:128px}.xl\:sl-mr-32{margin-right:128px}.xl\:sl-mb-32{margin-bottom:128px}.xl\:sl-ml-32{margin-left:128px}.xl\:sl-mt-36{margin-top:144px}.xl\:sl-mr-36{margin-right:144px}.xl\:sl-mb-36{margin-bottom:144px}.xl\:sl-ml-36{margin-left:144px}.xl\:sl-mt-40{margin-top:160px}.xl\:sl-mr-40{margin-right:160px}.xl\:sl-mb-4
0{margin-bottom:160px}.xl\:sl-ml-40{margin-left:160px}.xl\:sl-mt-44{margin-top:176px}.xl\:sl-mr-44{margin-right:176px}.xl\:sl-mb-44{margin-bottom:176px}.xl\:sl-ml-44{margin-left:176px}.xl\:sl-mt-48{margin-top:192px}.xl\:sl-mr-48{margin-right:192px}.xl\:sl-mb-48{margin-bottom:192px}.xl\:sl-ml-48{margin-left:192px}.xl\:sl-mt-52{margin-top:208px}.xl\:sl-mr-52{margin-right:208px}.xl\:sl-mb-52{margin-bottom:208px}.xl\:sl-ml-52{margin-left:208px}.xl\:sl-mt-56{margin-top:224px}.xl\:sl-mr-56{margin-right:224px}.xl\:sl-mb-56{margin-bottom:224px}.xl\:sl-ml-56{margin-left:224px}.xl\:sl-mt-60{margin-top:240px}.xl\:sl-mr-60{margin-right:240px}.xl\:sl-mb-60{margin-bottom:240px}.xl\:sl-ml-60{margin-left:240px}.xl\:sl-mt-64{margin-top:256px}.xl\:sl-mr-64{margin-right:256px}.xl\:sl-mb-64{margin-bottom:256px}.xl\:sl-ml-64{margin-left:256px}.xl\:sl-mt-72{margin-top:288px}.xl\:sl-mr-72{margin-right:288px}.xl\:sl-mb-72{margin-bottom:288px}.xl\:sl-ml-72{margin-left:288px}.xl\:sl-mt-80{margin-top:320px}.xl\:sl-mr-80{margin-right:320px}.xl\:sl-mb-80{margin-bottom:320px}.xl\:sl-ml-80{margin-left:320px}.xl\:sl-mt-96{margin-top:384px}.xl\:sl-mr-96{margin-right:384px}.xl\:sl-mb-96{margin-bottom:384px}.xl\:sl-ml-96{margin-left:384px}.xl\:sl-mt-auto{margin-top:auto}.xl\:sl-mr-auto{margin-right:auto}.xl\:sl-mb-auto{margin-bottom:auto}.xl\:sl-ml-auto{margin-left:auto}.xl\:sl-mt-px{margin-top:1px}.xl\:sl-mr-px{margin-right:1px}.xl\:sl-mb-px{margin-bottom:1px}.xl\:sl-ml-px{margin-left:1px}.xl\:sl-mt-0\.5{margin-top:2px}.xl\:sl-mr-0\.5{margin-right:2px}.xl\:sl-mb-0\.5{margin-bottom:2px}.xl\:sl-ml-0\.5{margin-left:2px}.xl\:sl-mt-1\.5{margin-top:6px}.xl\:sl-mr-1\.5{margin-right:6px}.xl\:sl-mb-1\.5{margin-bottom:6px}.xl\:sl-ml-1\.5{margin-left:6px}.xl\:sl-mt-2\.5{margin-top:10px}.xl\:sl-mr-2\.5{margin-right:10px}.xl\:sl-mb-2\.5{margin-bottom:10px}.xl\:sl-ml-2\.5{margin-left:10px}.xl\:sl-mt-3\.5{margin-top:14px}.xl\:sl-mr-3\.5{margin-right:14px}.xl\:sl-mb-3\.5{margin-bottom:14px}.xl\:sl-ml-3\.5{margin-left:14px}.xl\:sl-mt-4\.5{margin-top:18px}.xl\:sl-mr-4\.5{margin-right:18px}.xl\:sl-mb-4\.5{margin-bottom:18px}.xl\:sl-ml-4\.5{margin-left:18px}.xl\:sl--mt-0{margin-top:0}.xl\:sl--mr-0{margin-right:0}.xl\:sl--mb-0{margin-bottom:0}.xl\:sl--ml-0{margin-left:0}.xl\:sl--mt-1{margin-top:-4px}.xl\:sl--mr-1{margin-right:-4px}.xl\:sl--mb-1{margin-bottom:-4px}.xl\:sl--ml-1{margin-left:-4px}.xl\:sl--mt-2{margin-top:-8px}.xl\:sl--mr-2{margin-right:-8px}.xl\:sl--mb-2{margin-bottom:-8px}.xl\:sl--ml-2{margin-left:-8px}.xl\:sl--mt-3{margin-top:-12px}.xl\:sl--mr-3{margin-right:-12px}.xl\:sl--mb-3{margin-bottom:-12px}.xl\:sl--ml-3{margin-left:-12px}.xl\:sl--mt-4{margin-top:-16px}.xl\:sl--mr-4{margin-right:-16px}.xl\:sl--mb-4{margin-bottom:-16px}.xl\:sl--ml-4{margin-left:-16px}.xl\:sl--mt-5{margin-top:-20px}.xl\:sl--mr-5{margin-right:-20px}.xl\:sl--mb-5{margin-bottom:-20px}.xl\:sl--ml-5{margin-left:-20px}.xl\:sl--mt-6{margin-top:-24px}.xl\:sl--mr-6{margin-right:-24px}.xl\:sl--mb-6{margin-bottom:-24px}.xl\:sl--ml-6{margin-left:-24px}.xl\:sl--mt-7{margin-top:-28px}.xl\:sl--mr-7{margin-right:-28px}.xl\:sl--mb-7{margin-bottom:-28px}.xl\:sl--ml-7{margin-left:-28px}.xl\:sl--mt-8{margin-top:-32px}.xl\:sl--mr-8{margin-right:-32px}.xl\:sl--mb-8{margin-bottom:-32px}.xl\:sl--ml-8{margin-left:-32px}.xl\:sl--mt-9{margin-top:-36px}.xl\:sl--mr-9{margin-right:-36px}.xl\:sl--mb-9{margin-bottom:-36px}.xl\:sl--ml-9{margin-left:-36px}.xl\:sl--mt-10{margin-top:-40px}.xl\:sl--mr-10{margin-right:-40px}.xl\:sl--mb-10{margin-bottom:-40px}.xl\:sl--ml-10{margin-left:-40px}.xl
\:sl--mt-11{margin-top:-44px}.xl\:sl--mr-11{margin-right:-44px}.xl\:sl--mb-11{margin-bottom:-44px}.xl\:sl--ml-11{margin-left:-44px}.xl\:sl--mt-12{margin-top:-48px}.xl\:sl--mr-12{margin-right:-48px}.xl\:sl--mb-12{margin-bottom:-48px}.xl\:sl--ml-12{margin-left:-48px}.xl\:sl--mt-14{margin-top:-56px}.xl\:sl--mr-14{margin-right:-56px}.xl\:sl--mb-14{margin-bottom:-56px}.xl\:sl--ml-14{margin-left:-56px}.xl\:sl--mt-16{margin-top:-64px}.xl\:sl--mr-16{margin-right:-64px}.xl\:sl--mb-16{margin-bottom:-64px}.xl\:sl--ml-16{margin-left:-64px}.xl\:sl--mt-20{margin-top:-80px}.xl\:sl--mr-20{margin-right:-80px}.xl\:sl--mb-20{margin-bottom:-80px}.xl\:sl--ml-20{margin-left:-80px}.xl\:sl--mt-24{margin-top:-96px}.xl\:sl--mr-24{margin-right:-96px}.xl\:sl--mb-24{margin-bottom:-96px}.xl\:sl--ml-24{margin-left:-96px}.xl\:sl--mt-28{margin-top:-112px}.xl\:sl--mr-28{margin-right:-112px}.xl\:sl--mb-28{margin-bottom:-112px}.xl\:sl--ml-28{margin-left:-112px}.xl\:sl--mt-32{margin-top:-128px}.xl\:sl--mr-32{margin-right:-128px}.xl\:sl--mb-32{margin-bottom:-128px}.xl\:sl--ml-32{margin-left:-128px}.xl\:sl--mt-36{margin-top:-144px}.xl\:sl--mr-36{margin-right:-144px}.xl\:sl--mb-36{margin-bottom:-144px}.xl\:sl--ml-36{margin-left:-144px}.xl\:sl--mt-40{margin-top:-160px}.xl\:sl--mr-40{margin-right:-160px}.xl\:sl--mb-40{margin-bottom:-160px}.xl\:sl--ml-40{margin-left:-160px}.xl\:sl--mt-44{margin-top:-176px}.xl\:sl--mr-44{margin-right:-176px}.xl\:sl--mb-44{margin-bottom:-176px}.xl\:sl--ml-44{margin-left:-176px}.xl\:sl--mt-48{margin-top:-192px}.xl\:sl--mr-48{margin-right:-192px}.xl\:sl--mb-48{margin-bottom:-192px}.xl\:sl--ml-48{margin-left:-192px}.xl\:sl--mt-52{margin-top:-208px}.xl\:sl--mr-52{margin-right:-208px}.xl\:sl--mb-52{margin-bottom:-208px}.xl\:sl--ml-52{margin-left:-208px}.xl\:sl--mt-56{margin-top:-224px}.xl\:sl--mr-56{margin-right:-224px}.xl\:sl--mb-56{margin-bottom:-224px}.xl\:sl--ml-56{margin-left:-224px}.xl\:sl--mt-60{margin-top:-240px}.xl\:sl--mr-60{margin-right:-240px}.xl\:sl--mb-60{margin-bottom:-240px}.xl\:sl--ml-60{margin-left:-240px}.xl\:sl--mt-64{margin-top:-256px}.xl\:sl--mr-64{margin-right:-256px}.xl\:sl--mb-64{margin-bottom:-256px}.xl\:sl--ml-64{margin-left:-256px}.xl\:sl--mt-72{margin-top:-288px}.xl\:sl--mr-72{margin-right:-288px}.xl\:sl--mb-72{margin-bottom:-288px}.xl\:sl--ml-72{margin-left:-288px}.xl\:sl--mt-80{margin-top:-320px}.xl\:sl--mr-80{margin-right:-320px}.xl\:sl--mb-80{margin-bottom:-320px}.xl\:sl--ml-80{margin-left:-320px}.xl\:sl--mt-96{margin-top:-384px}.xl\:sl--mr-96{margin-right:-384px}.xl\:sl--mb-96{margin-bottom:-384px}.xl\:sl--ml-96{margin-left:-384px}.xl\:sl--mt-px{margin-top:-1px}.xl\:sl--mr-px{margin-right:-1px}.xl\:sl--mb-px{margin-bottom:-1px}.xl\:sl--ml-px{margin-left:-1px}.xl\:sl--mt-0\.5{margin-top:-2px}.xl\:sl--mr-0\.5{margin-right:-2px}.xl\:sl--mb-0\.5{margin-bottom:-2px}.xl\:sl--ml-0\.5{margin-left:-2px}.xl\:sl--mt-1\.5{margin-top:-6px}.xl\:sl--mr-1\.5{margin-right:-6px}.xl\:sl--mb-1\.5{margin-bottom:-6px}.xl\:sl--ml-1\.5{margin-left:-6px}.xl\:sl--mt-2\.5{margin-top:-10px}.xl\:sl--mr-2\.5{margin-right:-10px}.xl\:sl--mb-2\.5{margin-bottom:-10px}.xl\:sl--ml-2\.5{margin-left:-10px}.xl\:sl--mt-3\.5{margin-top:-14px}.xl\:sl--mr-3\.5{margin-right:-14px}.xl\:sl--mb-3\.5{margin-bottom:-14px}.xl\:sl--ml-3\.5{margin-left:-14px}.xl\:sl--mt-4\.5{margin-top:-18px}.xl\:sl--mr-4\.5{margin-right:-18px}.xl\:sl--mb-4\.5{margin-bottom:-18px}.xl\:sl--ml-4\.5{margin-left:-18px}.xl\:sl-max-h-full{max-height:100%}.xl\:sl-max-h-screen{max-height:100vh}.xl\:sl-max-w-none{max-width:none}.xl\:sl-max-w-full{ma
x-width:100%}.xl\:sl-max-w-min{max-width:-moz-min-content;max-width:min-content}.xl\:sl-max-w-max{max-width:-moz-max-content;max-width:max-content}.xl\:sl-max-w-prose{max-width:65ch}.xl\:sl-min-h-full{min-height:100%}.xl\:sl-min-h-screen{min-height:100vh}.xl\:sl-min-w-full{min-width:100%}.xl\:sl-min-w-min{min-width:-moz-min-content;min-width:min-content}.xl\:sl-min-w-max{min-width:-moz-max-content;min-width:max-content}.xl\:sl-p-0{padding:0}.xl\:sl-p-1{padding:4px}.xl\:sl-p-2{padding:8px}.xl\:sl-p-3{padding:12px}.xl\:sl-p-4{padding:16px}.xl\:sl-p-5{padding:20px}.xl\:sl-p-6{padding:24px}.xl\:sl-p-7{padding:28px}.xl\:sl-p-8{padding:32px}.xl\:sl-p-9{padding:36px}.xl\:sl-p-10{padding:40px}.xl\:sl-p-11{padding:44px}.xl\:sl-p-12{padding:48px}.xl\:sl-p-14{padding:56px}.xl\:sl-p-16{padding:64px}.xl\:sl-p-20{padding:80px}.xl\:sl-p-24{padding:96px}.xl\:sl-p-28{padding:112px}.xl\:sl-p-32{padding:128px}.xl\:sl-p-36{padding:144px}.xl\:sl-p-40{padding:160px}.xl\:sl-p-44{padding:176px}.xl\:sl-p-48{padding:192px}.xl\:sl-p-52{padding:208px}.xl\:sl-p-56{padding:224px}.xl\:sl-p-60{padding:240px}.xl\:sl-p-64{padding:256px}.xl\:sl-p-72{padding:288px}.xl\:sl-p-80{padding:320px}.xl\:sl-p-96{padding:384px}.xl\:sl-p-px{padding:1px}.xl\:sl-p-0\.5{padding:2px}.xl\:sl-p-1\.5{padding:6px}.xl\:sl-p-2\.5{padding:10px}.xl\:sl-p-3\.5{padding:14px}.xl\:sl-p-4\.5{padding:18px}.xl\:sl-py-0{padding-bottom:0;padding-top:0}.xl\:sl-px-0{padding-left:0;padding-right:0}.xl\:sl-py-1{padding-bottom:4px;padding-top:4px}.xl\:sl-px-1{padding-left:4px;padding-right:4px}.xl\:sl-py-2{padding-bottom:8px;padding-top:8px}.xl\:sl-px-2{padding-left:8px;padding-right:8px}.xl\:sl-py-3{padding-bottom:12px;padding-top:12px}.xl\:sl-px-3{padding-left:12px;padding-right:12px}.xl\:sl-py-4{padding-bottom:16px;padding-top:16px}.xl\:sl-px-4{padding-left:16px;padding-right:16px}.xl\:sl-py-5{padding-bottom:20px;padding-top:20px}.xl\:sl-px-5{padding-left:20px;padding-right:20px}.xl\:sl-py-6{padding-bottom:24px;padding-top:24px}.xl\:sl-px-6{padding-left:24px;padding-right:24px}.xl\:sl-py-7{padding-bottom:28px;padding-top:28px}.xl\:sl-px-7{padding-left:28px;padding-right:28px}.xl\:sl-py-8{padding-bottom:32px;padding-top:32px}.xl\:sl-px-8{padding-left:32px;padding-right:32px}.xl\:sl-py-9{padding-bottom:36px;padding-top:36px}.xl\:sl-px-9{padding-left:36px;padding-right:36px}.xl\:sl-py-10{padding-bottom:40px;padding-top:40px}.xl\:sl-px-10{padding-left:40px;padding-right:40px}.xl\:sl-py-11{padding-bottom:44px;padding-top:44px}.xl\:sl-px-11{padding-left:44px;padding-right:44px}.xl\:sl-py-12{padding-bottom:48px;padding-top:48px}.xl\:sl-px-12{padding-left:48px;padding-right:48px}.xl\:sl-py-14{padding-bottom:56px;padding-top:56px}.xl\:sl-px-14{padding-left:56px;padding-right:56px}.xl\:sl-py-16{padding-bottom:64px;padding-top:64px}.xl\:sl-px-16{padding-left:64px;padding-right:64px}.xl\:sl-py-20{padding-bottom:80px;padding-top:80px}.xl\:sl-px-20{padding-left:80px;padding-right:80px}.xl\:sl-py-24{padding-bottom:96px;padding-top:96px}.xl\:sl-px-24{padding-left:96px;padding-right:96px}.xl\:sl-py-28{padding-bottom:112px;padding-top:112px}.xl\:sl-px-28{padding-left:112px;padding-right:112px}.xl\:sl-py-32{padding-bottom:128px;padding-top:128px}.xl\:sl-px-32{padding-left:128px;padding-right:128px}.xl\:sl-py-36{padding-bottom:144px;padding-top:144px}.xl\:sl-px-36{padding-left:144px;padding-right:144px}.xl\:sl-py-40{padding-bottom:160px;padding-top:160px}.xl\:sl-px-40{padding-left:160px;padding-right:160px}.xl\:sl-py-44{padding-bottom:176px;padding-top:176px}.xl\:sl-px-44{paddin
g-left:176px;padding-right:176px}.xl\:sl-py-48{padding-bottom:192px;padding-top:192px}.xl\:sl-px-48{padding-left:192px;padding-right:192px}.xl\:sl-py-52{padding-bottom:208px;padding-top:208px}.xl\:sl-px-52{padding-left:208px;padding-right:208px}.xl\:sl-py-56{padding-bottom:224px;padding-top:224px}.xl\:sl-px-56{padding-left:224px;padding-right:224px}.xl\:sl-py-60{padding-bottom:240px;padding-top:240px}.xl\:sl-px-60{padding-left:240px;padding-right:240px}.xl\:sl-py-64{padding-bottom:256px;padding-top:256px}.xl\:sl-px-64{padding-left:256px;padding-right:256px}.xl\:sl-py-72{padding-bottom:288px;padding-top:288px}.xl\:sl-px-72{padding-left:288px;padding-right:288px}.xl\:sl-py-80{padding-bottom:320px;padding-top:320px}.xl\:sl-px-80{padding-left:320px;padding-right:320px}.xl\:sl-py-96{padding-bottom:384px;padding-top:384px}.xl\:sl-px-96{padding-left:384px;padding-right:384px}.xl\:sl-py-px{padding-bottom:1px;padding-top:1px}.xl\:sl-px-px{padding-left:1px;padding-right:1px}.xl\:sl-py-0\.5{padding-bottom:2px;padding-top:2px}.xl\:sl-px-0\.5{padding-left:2px;padding-right:2px}.xl\:sl-py-1\.5{padding-bottom:6px;padding-top:6px}.xl\:sl-px-1\.5{padding-left:6px;padding-right:6px}.xl\:sl-py-2\.5{padding-bottom:10px;padding-top:10px}.xl\:sl-px-2\.5{padding-left:10px;padding-right:10px}.xl\:sl-py-3\.5{padding-bottom:14px;padding-top:14px}.xl\:sl-px-3\.5{padding-left:14px;padding-right:14px}.xl\:sl-py-4\.5{padding-bottom:18px;padding-top:18px}.xl\:sl-px-4\.5{padding-left:18px;padding-right:18px}.xl\:sl-pt-0{padding-top:0}.xl\:sl-pr-0{padding-right:0}.xl\:sl-pb-0{padding-bottom:0}.xl\:sl-pl-0{padding-left:0}.xl\:sl-pt-1{padding-top:4px}.xl\:sl-pr-1{padding-right:4px}.xl\:sl-pb-1{padding-bottom:4px}.xl\:sl-pl-1{padding-left:4px}.xl\:sl-pt-2{padding-top:8px}.xl\:sl-pr-2{padding-right:8px}.xl\:sl-pb-2{padding-bottom:8px}.xl\:sl-pl-2{padding-left:8px}.xl\:sl-pt-3{padding-top:12px}.xl\:sl-pr-3{padding-right:12px}.xl\:sl-pb-3{padding-bottom:12px}.xl\:sl-pl-3{padding-left:12px}.xl\:sl-pt-4{padding-top:16px}.xl\:sl-pr-4{padding-right:16px}.xl\:sl-pb-4{padding-bottom:16px}.xl\:sl-pl-4{padding-left:16px}.xl\:sl-pt-5{padding-top:20px}.xl\:sl-pr-5{padding-right:20px}.xl\:sl-pb-5{padding-bottom:20px}.xl\:sl-pl-5{padding-left:20px}.xl\:sl-pt-6{padding-top:24px}.xl\:sl-pr-6{padding-right:24px}.xl\:sl-pb-6{padding-bottom:24px}.xl\:sl-pl-6{padding-left:24px}.xl\:sl-pt-7{padding-top:28px}.xl\:sl-pr-7{padding-right:28px}.xl\:sl-pb-7{padding-bottom:28px}.xl\:sl-pl-7{padding-left:28px}.xl\:sl-pt-8{padding-top:32px}.xl\:sl-pr-8{padding-right:32px}.xl\:sl-pb-8{padding-bottom:32px}.xl\:sl-pl-8{padding-left:32px}.xl\:sl-pt-9{padding-top:36px}.xl\:sl-pr-9{padding-right:36px}.xl\:sl-pb-9{padding-bottom:36px}.xl\:sl-pl-9{padding-left:36px}.xl\:sl-pt-10{padding-top:40px}.xl\:sl-pr-10{padding-right:40px}.xl\:sl-pb-10{padding-bottom:40px}.xl\:sl-pl-10{padding-left:40px}.xl\:sl-pt-11{padding-top:44px}.xl\:sl-pr-11{padding-right:44px}.xl\:sl-pb-11{padding-bottom:44px}.xl\:sl-pl-11{padding-left:44px}.xl\:sl-pt-12{padding-top:48px}.xl\:sl-pr-12{padding-right:48px}.xl\:sl-pb-12{padding-bottom:48px}.xl\:sl-pl-12{padding-left:48px}.xl\:sl-pt-14{padding-top:56px}.xl\:sl-pr-14{padding-right:56px}.xl\:sl-pb-14{padding-bottom:56px}.xl\:sl-pl-14{padding-left:56px}.xl\:sl-pt-16{padding-top:64px}.xl\:sl-pr-16{padding-right:64px}.xl\:sl-pb-16{padding-bottom:64px}.xl\:sl-pl-16{padding-left:64px}.xl\:sl-pt-20{padding-top:80px}.xl\:sl-pr-20{padding-right:80px}.xl\:sl-pb-20{padding-bottom:80px}.xl\:sl-pl-20{padding-left:80px}.xl\:sl-pt-24{padding-top:96px}.xl
\:sl-pr-24{padding-right:96px}.xl\:sl-pb-24{padding-bottom:96px}.xl\:sl-pl-24{padding-left:96px}.xl\:sl-pt-28{padding-top:112px}.xl\:sl-pr-28{padding-right:112px}.xl\:sl-pb-28{padding-bottom:112px}.xl\:sl-pl-28{padding-left:112px}.xl\:sl-pt-32{padding-top:128px}.xl\:sl-pr-32{padding-right:128px}.xl\:sl-pb-32{padding-bottom:128px}.xl\:sl-pl-32{padding-left:128px}.xl\:sl-pt-36{padding-top:144px}.xl\:sl-pr-36{padding-right:144px}.xl\:sl-pb-36{padding-bottom:144px}.xl\:sl-pl-36{padding-left:144px}.xl\:sl-pt-40{padding-top:160px}.xl\:sl-pr-40{padding-right:160px}.xl\:sl-pb-40{padding-bottom:160px}.xl\:sl-pl-40{padding-left:160px}.xl\:sl-pt-44{padding-top:176px}.xl\:sl-pr-44{padding-right:176px}.xl\:sl-pb-44{padding-bottom:176px}.xl\:sl-pl-44{padding-left:176px}.xl\:sl-pt-48{padding-top:192px}.xl\:sl-pr-48{padding-right:192px}.xl\:sl-pb-48{padding-bottom:192px}.xl\:sl-pl-48{padding-left:192px}.xl\:sl-pt-52{padding-top:208px}.xl\:sl-pr-52{padding-right:208px}.xl\:sl-pb-52{padding-bottom:208px}.xl\:sl-pl-52{padding-left:208px}.xl\:sl-pt-56{padding-top:224px}.xl\:sl-pr-56{padding-right:224px}.xl\:sl-pb-56{padding-bottom:224px}.xl\:sl-pl-56{padding-left:224px}.xl\:sl-pt-60{padding-top:240px}.xl\:sl-pr-60{padding-right:240px}.xl\:sl-pb-60{padding-bottom:240px}.xl\:sl-pl-60{padding-left:240px}.xl\:sl-pt-64{padding-top:256px}.xl\:sl-pr-64{padding-right:256px}.xl\:sl-pb-64{padding-bottom:256px}.xl\:sl-pl-64{padding-left:256px}.xl\:sl-pt-72{padding-top:288px}.xl\:sl-pr-72{padding-right:288px}.xl\:sl-pb-72{padding-bottom:288px}.xl\:sl-pl-72{padding-left:288px}.xl\:sl-pt-80{padding-top:320px}.xl\:sl-pr-80{padding-right:320px}.xl\:sl-pb-80{padding-bottom:320px}.xl\:sl-pl-80{padding-left:320px}.xl\:sl-pt-96{padding-top:384px}.xl\:sl-pr-96{padding-right:384px}.xl\:sl-pb-96{padding-bottom:384px}.xl\:sl-pl-96{padding-left:384px}.xl\:sl-pt-px{padding-top:1px}.xl\:sl-pr-px{padding-right:1px}.xl\:sl-pb-px{padding-bottom:1px}.xl\:sl-pl-px{padding-left:1px}.xl\:sl-pt-0\.5{padding-top:2px}.xl\:sl-pr-0\.5{padding-right:2px}.xl\:sl-pb-0\.5{padding-bottom:2px}.xl\:sl-pl-0\.5{padding-left:2px}.xl\:sl-pt-1\.5{padding-top:6px}.xl\:sl-pr-1\.5{padding-right:6px}.xl\:sl-pb-1\.5{padding-bottom:6px}.xl\:sl-pl-1\.5{padding-left:6px}.xl\:sl-pt-2\.5{padding-top:10px}.xl\:sl-pr-2\.5{padding-right:10px}.xl\:sl-pb-2\.5{padding-bottom:10px}.xl\:sl-pl-2\.5{padding-left:10px}.xl\:sl-pt-3\.5{padding-top:14px}.xl\:sl-pr-3\.5{padding-right:14px}.xl\:sl-pb-3\.5{padding-bottom:14px}.xl\:sl-pl-3\.5{padding-left:14px}.xl\:sl-pt-4\.5{padding-top:18px}.xl\:sl-pr-4\.5{padding-right:18px}.xl\:sl-pb-4\.5{padding-bottom:18px}.xl\:sl-pl-4\.5{padding-left:18px}.xl\:sl-static{position:static}.xl\:sl-fixed{position:fixed}.xl\:sl-absolute{position:absolute}.xl\:sl-relative{position:relative}.xl\:sl-sticky{position:-webkit-sticky;position:sticky}.xl\:sl-visible{visibility:visible}.xl\:sl-invisible{visibility:hidden}.sl-group:hover .xl\:group-hover\:sl-visible{visibility:visible}.sl-group:hover .xl\:group-hover\:sl-invisible{visibility:hidden}.sl-group:focus .xl\:group-focus\:sl-visible{visibility:visible}.sl-group:focus 
.xl\:group-focus\:sl-invisible{visibility:hidden}.xl\:sl-w-0{width:0}.xl\:sl-w-1{width:4px}.xl\:sl-w-2{width:8px}.xl\:sl-w-3{width:12px}.xl\:sl-w-4{width:16px}.xl\:sl-w-5{width:20px}.xl\:sl-w-6{width:24px}.xl\:sl-w-7{width:28px}.xl\:sl-w-8{width:32px}.xl\:sl-w-9{width:36px}.xl\:sl-w-10{width:40px}.xl\:sl-w-11{width:44px}.xl\:sl-w-12{width:48px}.xl\:sl-w-14{width:56px}.xl\:sl-w-16{width:64px}.xl\:sl-w-20{width:80px}.xl\:sl-w-24{width:96px}.xl\:sl-w-28{width:112px}.xl\:sl-w-32{width:128px}.xl\:sl-w-36{width:144px}.xl\:sl-w-40{width:160px}.xl\:sl-w-44{width:176px}.xl\:sl-w-48{width:192px}.xl\:sl-w-52{width:208px}.xl\:sl-w-56{width:224px}.xl\:sl-w-60{width:240px}.xl\:sl-w-64{width:256px}.xl\:sl-w-72{width:288px}.xl\:sl-w-80{width:320px}.xl\:sl-w-96{width:384px}.xl\:sl-w-auto{width:auto}.xl\:sl-w-px{width:1px}.xl\:sl-w-0\.5{width:2px}.xl\:sl-w-1\.5{width:6px}.xl\:sl-w-2\.5{width:10px}.xl\:sl-w-3\.5{width:14px}.xl\:sl-w-4\.5{width:18px}.xl\:sl-w-xs{width:20px}.xl\:sl-w-sm{width:24px}.xl\:sl-w-md{width:32px}.xl\:sl-w-lg{width:36px}.xl\:sl-w-xl{width:44px}.xl\:sl-w-2xl{width:52px}.xl\:sl-w-3xl{width:60px}.xl\:sl-w-1\/2{width:50%}.xl\:sl-w-1\/3{width:33.333333%}.xl\:sl-w-2\/3{width:66.666667%}.xl\:sl-w-1\/4{width:25%}.xl\:sl-w-2\/4{width:50%}.xl\:sl-w-3\/4{width:75%}.xl\:sl-w-1\/5{width:20%}.xl\:sl-w-2\/5{width:40%}.xl\:sl-w-3\/5{width:60%}.xl\:sl-w-4\/5{width:80%}.xl\:sl-w-1\/6{width:16.666667%}.xl\:sl-w-2\/6{width:33.333333%}.xl\:sl-w-3\/6{width:50%}.xl\:sl-w-4\/6{width:66.666667%}.xl\:sl-w-5\/6{width:83.333333%}.xl\:sl-w-full{width:100%}.xl\:sl-w-screen{width:100vw}.xl\:sl-w-min{width:-moz-min-content;width:min-content}.xl\:sl-w-max{width:-moz-max-content;width:max-content}}:root,[data-theme=light],[data-theme=light] .sl-inverted .sl-inverted,[data-theme=light] .sl-inverted .sl-inverted .sl-inverted .sl-inverted{--text-h:0;--text-s:0%;--text-l:15%;--shadow-sm:0px 0px 1px rgba(67,90,111,.3);--shadow-md:0px 2px 4px -2px rgba(0,0,0,.25),0px 0px 1px rgba(67,90,111,.3);--shadow-lg:0 4px 17px rgba(67,90,111,.2),0 2px 3px rgba(0,0,0,.1),inset 0 0 0 .5px var(--color-canvas-pure),0 0 0 .5px rgba(0,0,0,.2);--shadow-xl:0px 0px 1px rgba(67,90,111,.3),0px 8px 10px -4px rgba(67,90,111,.45);--shadow-2xl:0px 0px 1px rgba(67,90,111,.3),0px 16px 24px -8px rgba(67,90,111,.45);--drop-shadow-default1:0 0 0.5px rgba(0,0,0,.6);--drop-shadow-default2:0 2px 5px rgba(67,90,111,.3);--color-text-heading:hsla(var(--text-h),var(--text-s),max(3,calc(var(--text-l) - 
15)),1);--color-text:hsla(var(--text-h),var(--text-s),var(--text-l),1);--color-text-paragraph:hsla(var(--text-h),var(--text-s),var(--text-l),0.9);--color-text-muted:hsla(var(--text-h),var(--text-s),var(--text-l),0.7);--color-text-light:hsla(var(--text-h),var(--text-s),var(--text-l),0.55);--color-text-disabled:hsla(var(--text-h),var(--text-s),var(--text-l),0.3);--canvas-h:218;--canvas-s:40%;--canvas-l:100%;--color-canvas:hsla(var(--canvas-h),var(--canvas-s),var(--canvas-l),1);--color-canvas-dark:#2d3748;--color-canvas-pure:#fff;--color-canvas-tint:rgba(245,247,250,.5);--color-canvas-50:#f5f7fa;--color-canvas-100:#ebeef5;--color-canvas-200:#e0e6f0;--color-canvas-300:#d5ddeb;--color-canvas-400:#cbd5e7;--color-canvas-500:#c0cde2;--color-canvas-dialog:#fff;--color-border-dark:hsla(var(--canvas-h),30%,72%,0.5);--color-border:hsla(var(--canvas-h),32%,78%,0.5);--color-border-light:hsla(var(--canvas-h),24%,84%,0.5);--color-border-input:hsla(var(--canvas-h),24%,72%,0.8);--color-border-button:hsla(var(--canvas-h),24%,20%,0.65);--primary-h:202;--primary-s:100%;--primary-l:55%;--color-text-primary:#0081cc;--color-primary-dark:#1891d8;--color-primary-darker:#126fa5;--color-primary:#19abff;--color-primary-light:#52bfff;--color-primary-tint:rgba(77,190,255,.25);--color-on-primary:#fff;--success-h:156;--success-s:95%;--success-l:37%;--color-text-success:#05c779;--color-success-dark:#138b5b;--color-success-darker:#0f6c47;--color-success:#05b870;--color-success-light:#06db86;--color-success-tint:rgba(81,251,183,.25);--color-on-success:#fff;--warning-h:20;--warning-s:90%;--warning-l:56%;--color-text-warning:#c2470a;--color-warning-dark:#d35d22;--color-warning-darker:#9e461a;--color-warning:#f46d2a;--color-warning-light:#f7925f;--color-warning-tint:rgba(246,139,85,.25);--color-on-warning:#fff;--danger-h:0;--danger-s:84%;--danger-l:63%;--color-text-danger:#bc1010;--color-danger-dark:#d83b3b;--color-danger-darker:#af2323;--color-danger:#f05151;--color-danger-light:#f58e8e;--color-danger-tint:rgba(241,91,91,.25);--color-on-danger:#fff;color:var(--color-text)}:root .sl-inverted,[data-theme=light] .sl-inverted,[data-theme=light] .sl-inverted .sl-inverted .sl-inverted{--text-h:0;--text-s:0%;--text-l:86%;--shadow-sm:0px 0px 1px rgba(11,13,19,.5);--shadow-md:0px 2px 4px -2px rgba(0,0,0,.35),0px 0px 1px rgba(11,13,19,.4);--shadow-lg:0 2px 14px rgba(0,0,0,.55),0 0 0 0.5px hsla(0,0%,100%,.2);--shadow-xl:0px 0px 1px rgba(11,13,19,.4),0px 8px 10px -4px rgba(11,13,19,.55);--shadow-2xl:0px 0px 1px rgba(11,13,19,.4),0px 16px 24px -8px rgba(11,13,19,.55);--drop-shadow-default1:0 0 0.5px hsla(0,0%,100%,.5);--drop-shadow-default2:0 3px 8px rgba(0,0,0,.6);--color-text-heading:hsla(var(--text-h),var(--text-s),max(3,calc(var(--text-l) - 
15)),1);--color-text:hsla(var(--text-h),var(--text-s),var(--text-l),1);--color-text-paragraph:hsla(var(--text-h),var(--text-s),var(--text-l),0.9);--color-text-muted:hsla(var(--text-h),var(--text-s),var(--text-l),0.7);--color-text-light:hsla(var(--text-h),var(--text-s),var(--text-l),0.55);--color-text-disabled:hsla(var(--text-h),var(--text-s),var(--text-l),0.3);--canvas-h:218;--canvas-s:32%;--canvas-l:10%;--color-canvas:hsla(var(--canvas-h),var(--canvas-s),var(--canvas-l),1);--color-canvas-dark:#2d3748;--color-canvas-pure:#0c1018;--color-canvas-tint:rgba(60,76,103,.2);--color-canvas-50:#3c4c67;--color-canvas-100:#2d394e;--color-canvas-200:#212a3b;--color-canvas-300:#19212e;--color-canvas-400:#171e2b;--color-canvas-500:#151c28;--color-canvas-dialog:#2d394e;--color-border-dark:hsla(var(--canvas-h),24%,23%,0.5);--color-border:hsla(var(--canvas-h),26%,28%,0.5);--color-border-light:hsla(var(--canvas-h),19%,34%,0.5);--color-border-input:hsla(var(--canvas-h),19%,30%,0.8);--color-border-button:hsla(var(--canvas-h),19%,80%,0.65);--primary-h:202;--primary-s:90%;--primary-l:51%;--color-text-primary:#66c7ff;--color-primary-dark:#1f83bd;--color-primary-darker:#186491;--color-primary:#12a0f3;--color-primary-light:#42b3f5;--color-primary-tint:rgba(85,187,246,.25);--color-on-primary:#fff;--success-h:156;--success-s:95%;--success-l:67%;--color-text-success:#41f1ab;--color-success-dark:#47dca0;--color-success-darker:#24bc7f;--color-success:#62f3b9;--color-success-light:#a0f8d5;--color-success-tint:rgba(89,243,181,.25);--color-on-success:#fff;--warning-h:20;--warning-s:90%;--warning-l:50%;--color-text-warning:#ec7d46;--color-warning-dark:#b55626;--color-warning-darker:#8b421d;--color-warning:#e75d18;--color-warning-light:#ec7d46;--color-warning-tint:rgba(238,142,93,.25);--color-on-warning:#fff;--danger-h:0;--danger-s:84%;--danger-l:43%;--color-text-danger:#e74b4b;--color-danger-dark:#972626;--color-danger-darker:#721d1d;--color-danger:#c11a1a;--color-danger-light:#e22828;--color-danger-tint:rgba(234,98,98,.25);--color-on-danger:#fff;color:var(--color-text)}[data-theme=dark],[data-theme=dark] .sl-inverted .sl-inverted,[data-theme=dark] .sl-inverted .sl-inverted .sl-inverted .sl-inverted{--text-h:0;--text-s:0%;--text-l:85%;--shadow-sm:0px 0px 1px rgba(11,13,19,.5);--shadow-md:0px 2px 4px -2px rgba(0,0,0,.35),0px 0px 1px rgba(11,13,19,.4);--shadow-lg:0 2px 14px rgba(0,0,0,.55),0 0 0 0.5px hsla(0,0%,100%,.2);--shadow-xl:0px 0px 1px rgba(11,13,19,.4),0px 8px 10px -4px rgba(11,13,19,.55);--shadow-2xl:0px 0px 1px rgba(11,13,19,.4),0px 16px 24px -8px rgba(11,13,19,.55);--drop-shadow-default1:0 0 0.5px hsla(0,0%,100%,.5);--drop-shadow-default2:0 3px 8px rgba(0,0,0,.6);--color-text-heading:hsla(var(--text-h),var(--text-s),max(3,calc(var(--text-l) - 
15)),1);--color-text:hsla(var(--text-h),var(--text-s),var(--text-l),1);--color-text-paragraph:hsla(var(--text-h),var(--text-s),var(--text-l),0.9);--color-text-muted:hsla(var(--text-h),var(--text-s),var(--text-l),0.7);--color-text-light:hsla(var(--text-h),var(--text-s),var(--text-l),0.55);--color-text-disabled:hsla(var(--text-h),var(--text-s),var(--text-l),0.3);--canvas-h:218;--canvas-s:32%;--canvas-l:8%;--color-canvas:hsla(var(--canvas-h),var(--canvas-s),var(--canvas-l),1);--color-canvas-dark:#2d3748;--color-canvas-pure:#090c11;--color-canvas-tint:rgba(57,71,96,.2);--color-canvas-50:#262f40;--color-canvas-100:#1a212d;--color-canvas-200:#121821;--color-canvas-300:#0e131a;--color-canvas-400:#0c1017;--color-canvas-500:#0c1017;--color-canvas-dialog:#1a212d;--color-border-dark:hsla(var(--canvas-h),24%,21%,0.5);--color-border:hsla(var(--canvas-h),26%,26%,0.5);--color-border-light:hsla(var(--canvas-h),19%,32%,0.5);--color-border-input:hsla(var(--canvas-h),19%,28%,0.8);--color-border-button:hsla(var(--canvas-h),19%,80%,0.65);--primary-h:202;--primary-s:80%;--primary-l:36%;--color-text-primary:#66c7ff;--color-primary-dark:#1c5a7d;--color-primary-darker:#154560;--color-primary:#126fa5;--color-primary-light:#1685c5;--color-primary-tint:rgba(21,130,193,.25);--color-on-primary:#fff;--success-h:156;--success-s:95%;--success-l:37%;--color-text-success:#4be7a9;--color-success-dark:#145239;--color-success-darker:#10422e;--color-success:#0f6c47;--color-success-light:#128255;--color-success-tint:rgba(26,188,123,.25);--color-on-success:#fff;--warning-h:20;--warning-s:90%;--warning-l:56%;--color-text-warning:#e28150;--color-warning-dark:#7d4021;--color-warning-darker:#61311a;--color-warning:#9e461a;--color-warning-light:#c1551f;--color-warning-tint:rgba(184,81,30,.25);--color-on-warning:#fff;--danger-h:0;--danger-s:84%;--danger-l:63%;--color-text-danger:#d55;--color-danger-dark:#892929;--color-danger-darker:#6a2020;--color-danger:#af2323;--color-danger-light:#d12929;--color-danger-tint:rgba(179,35,35,.25);--color-on-danger:#fff;color:var(--color-text)}[data-theme=dark] .sl-inverted,[data-theme=dark] .sl-inverted .sl-inverted .sl-inverted{--text-h:0;--text-s:0%;--text-l:89%;--shadow-sm:0px 0px 1px rgba(11,13,19,.5);--shadow-md:0px 2px 4px -2px rgba(0,0,0,.35),0px 0px 1px rgba(11,13,19,.4);--shadow-lg:0 2px 14px rgba(0,0,0,.55),0 0 0 0.5px hsla(0,0%,100%,.2);--shadow-xl:0px 0px 1px rgba(11,13,19,.4),0px 8px 10px -4px rgba(11,13,19,.55);--shadow-2xl:0px 0px 1px rgba(11,13,19,.4),0px 16px 24px -8px rgba(11,13,19,.55);--drop-shadow-default1:0 0 0.5px hsla(0,0%,100%,.5);--drop-shadow-default2:0 3px 8px rgba(0,0,0,.6);--color-text-heading:hsla(var(--text-h),var(--text-s),max(3,calc(var(--text-l) - 
15)),1);--color-text:hsla(var(--text-h),var(--text-s),var(--text-l),1);--color-text-paragraph:hsla(var(--text-h),var(--text-s),var(--text-l),0.9);--color-text-muted:hsla(var(--text-h),var(--text-s),var(--text-l),0.7);--color-text-light:hsla(var(--text-h),var(--text-s),var(--text-l),0.55);--color-text-disabled:hsla(var(--text-h),var(--text-s),var(--text-l),0.3);--canvas-h:218;--canvas-s:32%;--canvas-l:13%;--color-canvas:hsla(var(--canvas-h),var(--canvas-s),var(--canvas-l),1);--color-canvas-dark:#2d3748;--color-canvas-pure:#111722;--color-canvas-tint:rgba(66,83,112,.2);--color-canvas-50:#2b374a;--color-canvas-100:#222b3a;--color-canvas-200:#1a212e;--color-canvas-300:#141a24;--color-canvas-400:#121721;--color-canvas-500:#121721;--color-canvas-dialog:#222b3a;--color-border-dark:hsla(var(--canvas-h),24%,26%,0.5);--color-border:hsla(var(--canvas-h),26%,31%,0.5);--color-border-light:hsla(var(--canvas-h),19%,37%,0.5);--color-border-input:hsla(var(--canvas-h),19%,33%,0.8);--color-border-button:hsla(var(--canvas-h),19%,80%,0.65);--primary-h:202;--primary-s:80%;--primary-l:33%;--color-text-primary:#66c7ff;--color-primary-dark:#1a5475;--color-primary-darker:#14425c;--color-primary:#116697;--color-primary-light:#147cb8;--color-primary-tint:rgba(21,130,193,.25);--color-on-primary:#fff;--success-h:156;--success-s:95%;--success-l:67%;--color-text-success:#4be7a9;--color-success-dark:#25986a;--color-success-darker:#1c7350;--color-success:#1bc581;--color-success-light:#28e297;--color-success-tint:rgba(26,188,123,.25);--color-on-success:#fff;--warning-h:20;--warning-s:90%;--warning-l:50%;--color-text-warning:#e28150;--color-warning-dark:#713a1e;--color-warning-darker:#552b16;--color-warning:#914018;--color-warning-light:#ab4c1c;--color-warning-tint:rgba(184,81,30,.25);--color-on-warning:#fff;--danger-h:0;--danger-s:84%;--danger-l:43%;--color-text-danger:#d55;--color-danger-dark:#5e1c1c;--color-danger-darker:#471515;--color-danger:#771818;--color-danger-light:#911d1d;--color-danger-tint:rgba(179,35,35,.25);--color-on-danger:#fff;color:var(--color-text)}.sl-elements{font-size:13px}.sl-elements .svg-inline--fa{display:inline-block}.sl-elements .DocsSkeleton{animation:skeleton-glow .5s linear infinite alternate;background:rgba(206,217,224,.2);background-clip:padding-box!important;border-color:rgba(206,217,224,.2)!important;border-radius:2px;box-shadow:none!important;color:transparent!important;cursor:default;pointer-events:none;user-select:none}.sl-elements .Model{--fs-code:12px}.sl-elements .ElementsTableOfContentsItem:hover{color:inherit;text-decoration:none}.sl-elements .ParameterGrid{align-items:center;display:grid;grid-template-columns:fit-content(120px) 20px auto;margin-bottom:16px;padding-bottom:0;row-gap:3px}.sl-elements .TryItPanel>:nth-child(2){overflow:auto}.sl-elements .OperationParametersContent{max-height:162px}.sl-elements .Checkbox{max-width:15px;padding-right:3px}.sl-elements .TextForCheckBox{padding-left:9px;padding-top:6px}.sl-elements .TextRequestBody{margin-bottom:16px;max-height:200px;overflow-y:auto;padding-bottom:0}.sl-elements .HttpOperation .JsonSchemaViewer .sl-markdown-viewer p,.sl-elements .HttpOperation__Parameters .sl-markdown-viewer p,.sl-elements .Model .JsonSchemaViewer .sl-markdown-viewer p{font-size:12px;line-height:1.5em}.sl-elements .HttpOperation div[role=tablist]{overflow-x:auto}.sl-elements .HttpService .ServerInfo .sl-panel__titlebar div{height:100%;min-height:36px} diff --git a/website/static/css/stoplight-custom.css b/website/static/css/stoplight-custom.css new file 
mode 100644
index 00000000000..81467c5ef35
--- /dev/null
+++ b/website/static/css/stoplight-custom.css
@@ -0,0 +1,38 @@
+[data-theme="dark"] {
+  --stoplight-pre-background: #ebedf0;
+  --stoplight-pre-color: #333333;
+}
+
+[data-theme="light"] {
+  --stoplight-pre-background: var(--ifm-pre-background);
+  --stoplight-pre-color: var(--ifm-pre-color);
+}
+
+.sl-font-ui, .sl-font-prose, .sl-prose, .sl-button {
+  /* Ensure we use the same font in stoplight docs as in the rest of the docs site */
+  font-family: var(--ifm-font-family-base) !important;
+}
+
+.sl-text-base {
+  /* Bump font size to make up for slightly smaller font family (above) */
+  font-size: 13px !important;
+}
+
+.sl-font-ui {
+  /* Bump font size to make up for slightly smaller font family (above) */
+  font-size: 14px !important;
+
+  & a {
+    /* Ensure link colors aren't overriden by docusaurus css */
+    color: var(--color-text) !important;
+  }
+}
+
+/* Ensure codeblocks are legible when using darkmode */
+.sl-panel__content pre {
+  background-color: var(--stoplight-pre-background) !important;
+
+  & .plain, .sl-code-highlight__ln {
+    color: var(--stoplight-pre-color) !important;
+  }
+}
diff --git a/website/static/img/Filtering.png b/website/static/img/Filtering.png
index 5a15a59f23e..b05394bd459 100644
Binary files a/website/static/img/Filtering.png and b/website/static/img/Filtering.png differ
diff --git a/website/static/img/Paginate.png b/website/static/img/Paginate.png
index 84a15732c12..21e2fd138b8 100644
Binary files a/website/static/img/Paginate.png and b/website/static/img/Paginate.png differ
diff --git a/website/static/img/api-access-profile.jpg b/website/static/img/api-access-profile.jpg
new file mode 100644
index 00000000000..36ffd4beda8
Binary files /dev/null and b/website/static/img/api-access-profile.jpg differ
diff --git a/website/static/img/api-access-profile.png b/website/static/img/api-access-profile.png
deleted file mode 100644
index deade9f2135..00000000000
Binary files a/website/static/img/api-access-profile.png and /dev/null differ
diff --git a/website/static/img/blog/2023-10-31-to-defer-or-to-clone/preview.png b/website/static/img/blog/2023-10-31-to-defer-or-to-clone/preview.png
new file mode 100644
index 00000000000..4b8047a7ac5
Binary files /dev/null and b/website/static/img/blog/2023-10-31-to-defer-or-to-clone/preview.png differ
diff --git a/website/static/img/blog/authors/kshitij-aranke.jpg b/website/static/img/blog/authors/kshitij-aranke.jpg
new file mode 100644
index 00000000000..dd9da483972
Binary files /dev/null and b/website/static/img/blog/authors/kshitij-aranke.jpg differ
diff --git a/website/static/img/dbt-cloud-project-setup-flow-next.png b/website/static/img/dbt-cloud-project-setup-flow-next.png
index 660e8ae446a..92f46bccd0a 100644
Binary files a/website/static/img/dbt-cloud-project-setup-flow-next.png and b/website/static/img/dbt-cloud-project-setup-flow-next.png differ
diff --git a/website/static/img/delete_projects_from_dbt_cloud_20221023.gif b/website/static/img/delete_projects_from_dbt_cloud_20221023.gif
index 246c912c55b..b579556d457 100644
Binary files a/website/static/img/delete_projects_from_dbt_cloud_20221023.gif and b/website/static/img/delete_projects_from_dbt_cloud_20221023.gif differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/cross-project-lineage-child.png b/website/static/img/docs/collaborate/dbt-explorer/cross-project-lineage-child.png
new file mode 100644
index 00000000000..666db3384fa
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/cross-project-lineage-child.png differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/cross-project-lineage-parent.png b/website/static/img/docs/collaborate/dbt-explorer/cross-project-lineage-parent.png
new file mode 100644
index 00000000000..ee5d19de369
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/cross-project-lineage-parent.png differ
diff --git a/website/static/img/docs/dbt-cloud/cloud-ide/ide-command-bar.jpg b/website/static/img/docs/dbt-cloud/cloud-ide/ide-command-bar.jpg
new file mode 100644
index 00000000000..fe60ddd7e03
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/cloud-ide/ide-command-bar.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/azure/azure-redirect-uri.png b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/azure/azure-redirect-uri.png
index 7daaab4504d..3bb04467abd 100644
Binary files a/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/azure/azure-redirect-uri.png and b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/azure/azure-redirect-uri.png differ
diff --git a/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/enterprise-permission-sets-diagram.png b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/enterprise-permission-sets-diagram.png
deleted file mode 100644
index e8a80f29266..00000000000
Binary files a/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/enterprise-permission-sets-diagram.png and /dev/null differ
diff --git a/website/static/img/docs/dbt-cloud/defer-toggle.jpg b/website/static/img/docs/dbt-cloud/defer-toggle.jpg
new file mode 100644
index 00000000000..7bd5a1c1283
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/defer-toggle.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/semantic_foundation.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/semantic_foundation.jpg
new file mode 100644
index 00000000000..51b4a1752eb
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/semantic_foundation.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/dbt-cloud-enterprise/DBX-auth/dbt-databricks-oauth-user.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/dbt-cloud-enterprise/DBX-auth/dbt-databricks-oauth-user.png
new file mode 100644
index 00000000000..aecf99d726a
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/dbt-cloud-enterprise/DBX-auth/dbt-databricks-oauth-user.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/dbt-cloud-enterprise/DBX-auth/dbt-databricks-oauth.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/dbt-cloud-enterprise/DBX-auth/dbt-databricks-oauth.png
new file mode 100644
index 00000000000..bb32fab2afb
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/dbt-cloud-enterprise/DBX-auth/dbt-databricks-oauth.png differ
diff --git a/website/static/img/docs/deploy/native-retry.gif b/website/static/img/docs/deploy/native-retry.gif
new file mode 100644
index 00000000000..020a9958fc5
Binary files /dev/null and b/website/static/img/docs/deploy/native-retry.gif differ
diff --git a/website/static/img/icons/delphi.svg b/website/static/img/icons/delphi.svg
new file mode 100644
index 00000000000..7ac5c49571e
--- /dev/null
+++ b/website/static/img/icons/delphi.svg
@@ -0,0 +1 @@
diff --git a/website/static/img/icons/google-sheets-logo-icon.svg b/website/static/img/icons/google-sheets-logo-icon.svg
new file mode 100644
index 00000000000..d080c1dd53d
--- /dev/null
+++ b/website/static/img/icons/google-sheets-logo-icon.svg
@@ -0,0 +1 @@
\ No newline at end of file
diff --git a/website/static/img/icons/hex.svg b/website/static/img/icons/hex.svg
new file mode 100755
index 00000000000..00431ffe299
--- /dev/null
+++ b/website/static/img/icons/hex.svg
@@ -0,0 +1,5 @@
diff --git a/website/static/img/icons/klipfolio.svg b/website/static/img/icons/klipfolio.svg
new file mode 100644
index 00000000000..bdd583bd3a1
--- /dev/null
+++ b/website/static/img/icons/klipfolio.svg
@@ -0,0 +1,9 @@
+klipfolio-badge
\ No newline at end of file
diff --git a/website/static/img/icons/lightdash.svg b/website/static/img/icons/lightdash.svg
new file mode 100644
index 00000000000..96f4676e7ee
--- /dev/null
+++ b/website/static/img/icons/lightdash.svg
@@ -0,0 +1,96 @@
\ No newline at end of file
diff --git a/website/static/img/icons/materialize.svg b/website/static/img/icons/materialize.svg
new file mode 100644
index 00000000000..92f693cd94f
--- /dev/null
+++ b/website/static/img/icons/materialize.svg
@@ -0,0 +1,20 @@
diff --git a/website/static/img/icons/mode.svg b/website/static/img/icons/mode.svg
new file mode 100644
index 00000000000..269c182cd8b
--- /dev/null
+++ b/website/static/img/icons/mode.svg
@@ -0,0 +1,165 @@
\ No newline at end of file
diff --git a/website/static/img/icons/oracle.svg b/website/static/img/icons/oracle.svg
new file mode 100644
index 00000000000..6868dea2eb3
--- /dev/null
+++ b/website/static/img/icons/oracle.svg
@@ -0,0 +1,47 @@
\ No newline at end of file
diff --git a/website/static/img/icons/push.svg b/website/static/img/icons/push.svg
new file mode 100644
index 00000000000..8693b207f58
--- /dev/null
+++ b/website/static/img/icons/push.svg
@@ -0,0 +1,49 @@
+Created with Fabric.js 3.5.0
\ No newline at end of file
diff --git a/website/static/img/icons/tableau-software.svg b/website/static/img/icons/tableau-software.svg
new file mode 100644
index 00000000000..28996f1dadd
--- /dev/null
+++ b/website/static/img/icons/tableau-software.svg
@@ -0,0 +1 @@
\ No newline at end of file
diff --git a/website/static/img/icons/white/delphi.svg b/website/static/img/icons/white/delphi.svg
new file mode 100644
index 00000000000..7ac5c49571e
--- /dev/null
+++ b/website/static/img/icons/white/delphi.svg
@@ -0,0 +1 @@
diff --git a/website/static/img/icons/white/google-sheets-logo-icon.svg b/website/static/img/icons/white/google-sheets-logo-icon.svg
new file mode 100644
index 00000000000..d080c1dd53d
--- /dev/null
+++ b/website/static/img/icons/white/google-sheets-logo-icon.svg
@@ -0,0 +1 @@
\ No newline at end of file
diff --git a/website/static/img/icons/white/hex.svg b/website/static/img/icons/white/hex.svg
new file mode 100644
index 00000000000..00431ffe299
--- /dev/null
+++ b/website/static/img/icons/white/hex.svg
@@ -0,0 +1,5 @@
diff --git a/website/static/img/icons/white/klipfolio.svg b/website/static/img/icons/white/klipfolio.svg
new file mode 100644
index 00000000000..bdd583bd3a1
--- /dev/null
+++ b/website/static/img/icons/white/klipfolio.svg
@@ -0,0 +1,9 @@
+klipfolio-badge
\ No newline at end of file
diff --git a/website/static/img/icons/white/lightdash.svg b/website/static/img/icons/white/lightdash.svg
new file mode 100644
index 00000000000..96f4676e7ee
--- /dev/null
+++ b/website/static/img/icons/white/lightdash.svg
@@ -0,0 +1,96 @@
\ No newline at end of file
diff --git a/website/static/img/icons/white/materialize.svg b/website/static/img/icons/white/materialize.svg
new file mode 100644
index 00000000000..92f693cd94f
--- /dev/null
+++ b/website/static/img/icons/white/materialize.svg
@@ -0,0 +1,20 @@
diff --git a/website/static/img/icons/white/mode.svg b/website/static/img/icons/white/mode.svg
new file mode 100644
index 00000000000..269c182cd8b
--- /dev/null
+++ b/website/static/img/icons/white/mode.svg
@@ -0,0 +1,165 @@
\ No newline at end of file
diff --git a/website/static/img/icons/white/oracle.svg b/website/static/img/icons/white/oracle.svg
new file mode 100644
index 00000000000..6868dea2eb3
--- /dev/null
+++ b/website/static/img/icons/white/oracle.svg
@@ -0,0 +1,47 @@
\ No newline at end of file
diff --git a/website/static/img/icons/white/push.svg b/website/static/img/icons/white/push.svg
new file mode 100644
index 00000000000..05cd660607e
--- /dev/null
+++ b/website/static/img/icons/white/push.svg
@@ -0,0 +1,49 @@
+Created with Fabric.js 3.5.0
diff --git a/website/static/img/icons/white/tableau-software.svg b/website/static/img/icons/white/tableau-software.svg
new file mode 100644
index 00000000000..28996f1dadd
--- /dev/null
+++ b/website/static/img/icons/white/tableau-software.svg
@@ -0,0 +1 @@
\ No newline at end of file
diff --git a/website/static/img/node_color_example.png b/website/static/img/node_color_example.png
index 83b26f5735a..a1a62742ca0 100644
Binary files a/website/static/img/node_color_example.png and b/website/static/img/node_color_example.png differ
diff --git a/website/static/img/sample_email_data.png b/website/static/img/sample_email_data.png
deleted file mode 100644
index 7224d42e60b..00000000000
Binary files a/website/static/img/sample_email_data.png and /dev/null differ
diff --git a/website/vercel.json b/website/vercel.json
index c5fb0638fba..81d955e2d3d 100644
--- a/website/vercel.json
+++ b/website/vercel.json
@@ -2,6 +2,141 @@
   "cleanUrls": true,
   "trailingSlash": false,
   "redirects": [
+    {
+      "source": "/guides/migration/versions",
+      "destination": "/docs/dbt-versions/core-upgrade",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.7",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.7",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.6",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.6",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.5",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.5",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.4",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.4",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.3",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.3",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.2",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.2",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.1",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.1",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v1.0",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v1.0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v0.21",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v0.21",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/upgrading-to-v0.20",
+      "destination": "/docs/dbt-versions/core-upgrade/upgrading-to-v0.20",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-11-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-11-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-12-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-12-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-13-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-13-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-14-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-14-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-14-1",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-14-1",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-15-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-15-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-16-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-16-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-17-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-17-0",
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-18-0",
+      "destination":"/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-18-0" ,
+      "permanent": true
+    },
+    {
+      "source": "/guides/migration/versions/Older%20versions/upgrading-to-0-19-0",
+      "destination": "/docs/dbt-versions/core-upgrade/Older%20versions/upgrading-to-0-19-0",
+      "permanent": true
+    },
+    {
+      "source": "/reference/snowflake-permissions",
+      "destination": "/reference/database-permissions/snowflake-permissions",
+      "permanent": true
+    },
+    {
+      "source": "/docs/build/metricflow-cli",
+      "destination": "/docs/build/metricflow-commands",
+      "permanent": true
+    },
+    {
+      "source": "/docs/core/about-the-cli",
+      "destination": "/docs/core/about-dbt-core",
+      "permanent": true
+    },
+    {
+      "source": "/docs/cloud/about-cloud/about-cloud-ide",
+      "destination": "/docs/cloud/about-cloud-develop",
+      "permanent": true
+    },
+    {
+      "source": "/faqs/models/reference-models-in-another-project",
+      "destination": "/docs/collaborate/govern/project-dependencies",
+      "permanent": true
+    },
+    {
+      "source": "/faqs/Models/reference-models-in-another-project",
+      "destination": "/docs/collaborate/govern/project-dependencies",
+      "permanent": true
+    },
     {
       "source": "/docs/deploy/job-triggers",
       "destination": "/docs/deploy/deploy-jobs",
@@ -3991,6 +4126,11 @@
       "source": "/docs/dbt-cloud/on-premises/upgrading-kots",
       "destination": "/docs/deploy/single-tenant",
       "permanent": true
-    }
+    },
+    {
+      "source": "/reference/resource-properties/access",
+      "destination": "/reference/resource-configs/access",
+      "permanent": true
+    }
   ]
 }