|`<catalog_name>`| Yes | The name of the catalog you want to create. |
|`TYPE`| Yes | Specifies the catalog type. For Iceberg, set to `ICEBERG`. |
|`CONNECTION`| Yes | The connection parameters for the Iceberg catalog. |
|`TYPE` (inside `CONNECTION`) | Yes | The connection type. For Iceberg, it is typically set to `rest` for a REST-based connection. |
|`ADDRESS`| Yes | The address or URL of the Iceberg service (e.g., `http://127.0.0.1:8181`). |
|`WAREHOUSE`| Yes | The location of the Iceberg warehouse, usually an S3 bucket or a compatible object storage system. |
|`<connection_parameter>`| Yes | Connection parameters to establish connections with external storage. The required parameters vary based on the specific storage service and authentication methods. Refer to [Connection Parameters](/sql/sql-reference/connect-parameters) for detailed information. If you're using Amazon S3 or an S3-compatible storage system, make sure to prefix the parameters with `s3.` (e.g., `s3.region`, `s3.endpoint`). |
:::note
To read data from HDFS, you need to set the following environment variables before starting Databend. These environment variables ensure that Databend can access the necessary Java and Hadoop dependencies to interact with HDFS effectively. Make sure to replace "/path/to/java" and "/path/to/hadoop" with the actual paths to your Java and Hadoop installations, and adjust the CLASSPATH to include all the required Hadoop JAR files.
:::
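The environment variables the note refers to can be sketched as follows. This is a minimal illustration only: the exact variable values depend on your Java and Hadoop installations, and the `/path/to/java` and `/path/to/hadoop` placeholders must be replaced with real paths.

```shell
# Sketch only: substitute the actual installation paths for the placeholders.
export JAVA_HOME=/path/to/java
# Databend needs the JVM shared library on the loader path.
export LD_LIBRARY_PATH=${JAVA_HOME}/lib/server:${LD_LIBRARY_PATH}
export HADOOP_HOME=/path/to/hadoop
# Put the required Hadoop JARs on the classpath; `hadoop classpath --glob`
# expands the distribution's JARs, or list them manually.
export CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath --glob)
```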
## Usage Examples
This example shows how to create an Iceberg catalog using a REST-based connection, specifying the service address, warehouse location (S3), and optional parameters like AWS region and custom endpoint:
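A sketch of such a statement, assuming a local Iceberg REST catalog at `http://127.0.0.1:8181` and a MinIO-backed warehouse at `s3://databend/iceberg/` (the catalog name `iceberg_ctl`, region, and endpoint values below are illustrative):

```sql
CREATE CATALOG iceberg_ctl
TYPE = ICEBERG
CONNECTION = (
    TYPE = 'rest'
    ADDRESS = 'http://127.0.0.1:8181'
    WAREHOUSE = 's3://databend/iceberg/'
    "s3.region" = 'us-east-1'
    "s3.endpoint" = 'http://127.0.0.1:9000'
);
```

Note that the S3-related connection parameters carry the `s3.` prefix described in the table above.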