Commit 7c53369: update slack links
Parent: e0ae16d
6 files changed (17 additions & 17 deletions)


README.md

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ Follow the [tutorial](https://sqlmesh.readthedocs.io/en/stable/quick_start/) to
 ## Join our community
 We'd love to join you on your data journey. Connect with us in the following ways:

-* Join the [Tobiko Slack community](https://join.slack.com/t/tobiko-data/shared_invite/zt-1ma66d79v-a4dbf4DUpLAQJ8ptQrJygg) to ask questions, or just to say hi!
+* Join the [Tobiko Slack community](https://join.slack.com/t/tobiko-data/shared_invite/zt-1tofr385z-vi~hDISNABiYIgkfGM3Khg) to ask questions, or just to say hi!
 * File an issue on our [GitHub](https://github.com/TobikoData/sqlmesh/issues/new).
 * Send us an email at [hello@tobikodata.com](hello@tobikodata.com) with your questions or feedback.

docs/index.md

Lines changed: 1 addition & 1 deletion
@@ -72,4 +72,4 @@ SQLMesh was built on three core principles:
 ## Next steps
 * [Jump right in with the quickstart](quick_start.md)
 * [Learn more about SQLMesh concepts](concepts/overview.md)
-* [Join our Slack community](https://join.slack.com/t/tobiko-data/shared_invite/zt-1ma66d79v-a4dbf4DUpLAQJ8ptQrJygg)
+* [Join our Slack community](https://join.slack.com/t/tobiko-data/shared_invite/zt-1tofr385z-vi~hDISNABiYIgkfGM3Khg)

docs/integrations/engines.md

Lines changed: 11 additions & 11 deletions
@@ -25,7 +25,7 @@
 ## BigQuery - Airflow Scheduler
 **Engine Name:** `bigquery`

-In order to share a common implementation across local and Airflow, SQLMesh BigQuery implements its own hook and operator.
+In order to share a common implementation across local and Airflow, SQLMesh BigQuery implements its own hook and operator.

 To enable support for this operator, the Airflow BigQuery provider package should be installed on the target Airflow cluster along with SQLMesh with the BigQuery extra:
 ```
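The install snippet above is cut off at the hunk boundary. A plausible sketch of the command, inferred from the surrounding text (the exact package names are an assumption and are not shown in this diff), is:

```shell
# Assumed package names, not taken from this diff: Airflow's BigQuery
# support ships in apache-airflow-providers-google, and SQLMesh exposes
# a "bigquery" extra. Verify both against the current docs before running.
pip install apache-airflow-providers-google "sqlmesh[bigquery]"
```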
@@ -47,21 +47,21 @@ sqlmesh_airflow = SQLMeshAirflow(

 # Databricks
 ## Databricks - Local/Built-in Scheduler
-If your project contains Python models that use PySpark DataFrames AND you are using the built-in scheduler, then you must run plan/apply on a Databricks cluster.
+If your project contains Python models that use PySpark DataFrames AND you are using the built-in scheduler, then you must run plan/apply on a Databricks cluster.
 This can be done using the [Notebook magic](../reference/notebook.md) or by using the [CLI](../reference/cli.md).
-This is something we are looking into improving — please leave us feedback in [our Slack channel](https://join.slack.com/t/tobiko-data/shared_invite/zt-1ma66d79v-a4dbf4DUpLAQJ8ptQrJygg) if this impacts you.
+This is something we are looking into improving — please leave us feedback in [our Slack channel](https://join.slack.com/t/tobiko-data/shared_invite/zt-1tofr385z-vi~hDISNABiYIgkfGM3Khg) if this impacts you.
 A potential workaround until this support is added is to use [Databricks Connect](https://docs.databricks.com/dev-tools/databricks-connect.html). This will make it look like you are running on a cluster, and should theoretically work.

 Databricks has a few options for connection types to choose from:
 ### Type: databricks (Recommended)
-This type will automatically detect if you are running in an environment that already has a SparkSession defined.
-If it detects a SparkSession, then it assumes this is a Databricks SparkSession and uses that.
+This type will automatically detect if you are running in an environment that already has a SparkSession defined.
+If it detects a SparkSession, then it assumes this is a Databricks SparkSession and uses that.
 If it doesn't detect a SparkSession, then it will use the connection configuration to connect to Databricks over
-the [Databricks SQL Connector](https://docs.databricks.com/dev-tools/python-sql-connector.html).
+the [Databricks SQL Connector](https://docs.databricks.com/dev-tools/python-sql-connector.html).
 See [databricks_sql configuration](#type--databrickssql) for the connection configuration.

 ### Type: databricks_spark_session
-This connection type assumes that wherever you are running you have access to a Databricks SparkSession.
+This connection type assumes that wherever you are running you have access to a Databricks SparkSession.
 This will simplify the required configuration to run since you will not need to provide connection configuration.

 ### Type: databricks_sql
@@ -86,7 +86,7 @@ Databricks has multiple operators to help differentiate running a SQL query vs.

 When evaluating models, the SQLMesh Databricks integration implements the [DatabricksSubmitRunOperator](https://airflow.apache.org/docs/apache-airflow-providers-databricks/1.0.0/operators.html). This is needed to be able to run either SQL or Python scripts on the Databricks cluster.

-When performing environment management operations, the SQLMesh Databricks integration is similar to the [DatabricksSqlOperator](https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/operators/sql.html#databrickssqloperator), and relies on the same [DatabricksSqlHook](https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/_api/airflow/providers/databricks/hooks/databricks_sql/index.html#airflow.providers.databricks.hooks.databricks_sql.DatabricksSqlHook) implementation.
+When performing environment management operations, the SQLMesh Databricks integration is similar to the [DatabricksSqlOperator](https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/operators/sql.html#databrickssqloperator), and relies on the same [DatabricksSqlHook](https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/_api/airflow/providers/databricks/hooks/databricks_sql/index.html#airflow.providers.databricks.hooks.databricks_sql.DatabricksSqlHook) implementation.
 All environment management operations are SQL-based, and the overhead of submitting jobs can be avoided.

 ### Engine: `databricks-submit`
@@ -95,7 +95,7 @@ Whether evaluating models or performing environment management operations, the S

 ### Engine: `databricks-sql`

-Forces the SQLMesh Databricks integration to use the operator based on the [DatabricksSqlOperator](https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/operators/sql.html#databrickssqloperator) for all operations. If your project is pure SQL operations, then this is an option.
+Forces the SQLMesh Databricks integration to use the operator based on the [DatabricksSqlOperator](https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/operators/sql.html#databrickssqloperator) for all operations. If your project is pure SQL operations, then this is an option.

 To enable support for this operator, the Airflow Databricks provider package should be installed on the target Airflow cluster along with the SQLMesh package with databricks extra as follows:
 ```
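As with the BigQuery hunk, the install snippet is truncated here. A plausible sketch (package names are an assumption, not shown in this diff) is:

```shell
# Assumed package names, not taken from this diff: the Airflow Databricks
# provider is apache-airflow-providers-databricks, and SQLMesh exposes a
# "databricks" extra. Verify both against the current docs before running.
pip install apache-airflow-providers-databricks "sqlmesh[databricks]"
```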
@@ -146,7 +146,7 @@ sqlmesh_airflow = SQLMeshAirflow(
 | `database` | The optional database name. If not specified, the in-memory database is used | string | N |

 ## DuckDB - Airflow
-DuckDB only works when running locally; therefore it does not support Airflow.
+DuckDB only works when running locally; therefore it does not support Airflow.

 # Postgres
 ## Postgres - Local/Built-in Scheduler
@@ -212,7 +212,7 @@ sqlmesh_airflow = SQLMeshAirflow(
 ## Redshift - Airflow Scheduler
 **Engine Name:** `redshift`

-In order to share a common implementation across local and Airflow, SQLMesh Redshift implements its own hook and operator.
+In order to share a common implementation across local and Airflow, SQLMesh Redshift implements its own hook and operator.

 To enable support for this operator, the Airflow Redshift provider package should be installed on the target Airflow cluster along with SQLMesh with the Redshift extra:
 ```
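This install snippet is also truncated at the hunk boundary. A plausible sketch (package names are an assumption, not shown in this diff) is:

```shell
# Assumed package names, not taken from this diff: Airflow's Redshift
# support ships in the apache-airflow-providers-amazon package, and SQLMesh
# exposes a "redshift" extra. Verify both against the current docs before running.
pip install apache-airflow-providers-amazon "sqlmesh[redshift]"
```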

docs/integrations/github.md

Lines changed: 2 additions & 2 deletions
@@ -2,8 +2,8 @@

 SQLMesh's Github Actions integration will allow you to add a SQLMesh CI/CD bot to any Github project using [Github Actions](https://github.com/features/actions). The bot will automatically run [plan/apply](../concepts/plans.md) to an [environment](../concepts/environments.md) based on the code in a pull request.

-This will be done without copying or rebuilding data using SQLMesh's [Virtual Data Environments](../concepts/glossary.md#virtual-environments).
+This will be done without copying or rebuilding data using SQLMesh's [Virtual Data Environments](../concepts/glossary.md#virtual-environments).
 Once approved, the CI/CD bot will automatically run [plan/apply](../concepts/plans.md) to the production environment and merge the PR upon completion.
 This allows you to always have your main branch and prod environments in sync.

-We will be launching this CI/CD bot soon — in the meantime, please leave any feedback or questions in [our Slack channel](https://join.slack.com/t/tobiko-data/shared_invite/zt-1ma66d79v-a4dbf4DUpLAQJ8ptQrJygg)!
+We will be launching this CI/CD bot soon — in the meantime, please leave any feedback or questions in [our Slack channel](https://join.slack.com/t/tobiko-data/shared_invite/zt-1tofr385z-vi~hDISNABiYIgkfGM3Khg)!

docs/quick_start.md

Lines changed: 1 addition & 1 deletion
@@ -252,4 +252,4 @@ Double-check that the data did indeed update in prod by running `sqlmesh fetchdf
 Congratulations, you've now conquered the basics of using SQLMesh!

 * [Learn more about SQLMesh concepts](concepts/overview.md)
-* [Join our Slack community](https://join.slack.com/t/tobiko-data/shared_invite/zt-1ma66d79v-a4dbf4DUpLAQJ8ptQrJygg)
+* [Join our Slack community](https://join.slack.com/t/tobiko-data/shared_invite/zt-1tofr385z-vi~hDISNABiYIgkfGM3Khg)

posts/virtual_data_environments.md

Lines changed: 1 addition & 1 deletion
@@ -158,4 +158,4 @@ With **Virtual Data Environments**, SQLMesh is able to provide fully **isolated*

 To streamline deploying changes to production, our team is about to release the SQLMesh [CI/CD bot](https://github.com/TobikoData/sqlmesh/blob/main/docs/integrations/github.md), which will help automate this process.

-Don't miss out - join our [Slack channel](https://join.slack.com/t/tobiko-data/shared_invite/zt-1ma66d79v-a4dbf4DUpLAQJ8ptQrJygg) and stay tuned!
+Don't miss out - join our [Slack channel](https://join.slack.com/t/tobiko-data/shared_invite/zt-1tofr385z-vi~hDISNABiYIgkfGM3Khg) and stay tuned!
