 ---
-title: Iceberg Catalog
+title: Apache Iceberg
 ---
 import FunctionDescription from '@site/src/components/FunctionDescription';

@@ -63,7 +63,26 @@ CONNECTION=(
 | `TYPE` (inside `CONNECTION`) | Yes | The connection type. For Iceberg, it is typically set to `rest` for a REST-based connection. |
 | `ADDRESS` | Yes | The address or URL of the Iceberg service (e.g., `http://127.0.0.1:8181`). |
 | `WAREHOUSE` | Yes | The location of the Iceberg warehouse, usually an S3 bucket or compatible object storage system. |
-| `<connection_parameter>` | Yes | Connection parameters to establish connections with external storage. The required parameters vary based on the specific storage service and authentication methods. Refer to [Connection Parameters](/sql/sql-reference/connect-parameters) for detailed information. If you're using Amazon S3 or S3-compatible storage systems, make sure to prefix the parameters with `s3.` (e.g., `s3.region`, `s3.endpoint`). |
+| `<connection_parameter>` | Yes | Connection parameters to establish connections with external storage. The required parameters vary based on the specific storage service and authentication methods. See the table below for a full list of the available parameters. |
+
+| Connection Parameter | Description |
+|-----------------------------------|----------------------------------------------------------------------------------------------------------------------------------------|
+| `s3.endpoint` | S3 endpoint. |
+| `s3.access-key-id` | S3 access key ID. |
+| `s3.secret-access-key` | S3 secret access key. |
+| `s3.session-token` | S3 session token, required when using temporary credentials. |
+| `s3.region` | S3 region. |
+| `client.region` | Region to use for the S3 client; takes precedence over `s3.region`. |
+| `s3.path-style-access` | Option to use S3 path-style access. |
+| `s3.sse.type` | S3 Server-Side Encryption (SSE) type. |
+| `s3.sse.key` | S3 SSE key. If the encryption type is `kms`, this is a KMS key ID; if it is `custom`, this is a base64-encoded AES-256 symmetric key. |
+| `s3.sse.md5` | S3 SSE MD5 checksum. |
+| `client.assume-role.arn` | ARN of the IAM role to assume instead of using the default credential chain. |
+| `client.assume-role.external-id` | Optional external ID used when assuming an IAM role. |
+| `client.assume-role.session-name` | Optional session name used when assuming an IAM role. |
+| `s3.allow-anonymous` | Option to allow anonymous access (e.g., for public buckets/folders). |
+| `s3.disable-ec2-metadata` | Option to disable loading credentials from EC2 metadata (typically used together with `s3.allow-anonymous`). |
+| `s3.disable-config-load` | Option to disable loading configuration from config files and environment variables. |
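+
+For example, assuming the `CREATE CATALOG` syntax shown earlier on this page, a catalog backed by a local REST Iceberg service and an S3-compatible store such as MinIO might be created as follows. This is a minimal sketch: the catalog name, warehouse bucket, endpoints, and credentials are placeholders, so replace them with values for your environment.
+
+```sql
+-- Minimal sketch: the catalog name, endpoints, bucket, and credentials below are placeholders.
+CREATE CATALOG iceberg_demo
+TYPE = ICEBERG
+CONNECTION = (
+    TYPE = 'rest'
+    ADDRESS = 'http://127.0.0.1:8181'
+    WAREHOUSE = 's3://warehouse/demo/'
+    "s3.endpoint" = 'http://127.0.0.1:9000'
+    "s3.region" = 'us-east-1'
+    "s3.access-key-id" = 'minio_access_key'
+    "s3.secret-access-key" = 'minio_secret_key'
+);
+```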
 
 :::note
 To read data from HDFS, you need to set the following environment variables before starting Databend. These environment variables ensure that Databend can access the necessary Java and Hadoop dependencies to interact with HDFS effectively. Make sure to replace "/path/to/java" and "/path/to/hadoop" with the actual paths to your Java and Hadoop installations, and adjust the CLASSPATH to include all the required Hadoop JAR files.