@@ -76,9 +76,12 @@ Rows selected via an Optic query can be exported to any of the below file format
 
 The `export-avro-files` command writes one or more Avro files to the directory specified by the `--path` option. This
 command reuses Spark's support for writing Avro files. You can include any of the
-[Spark Avro options](https://spark.apache.org/docs/latest/sql-data-sources-avro.html) via the `-P` option to
+[Spark Avro data source options](https://spark.apache.org/docs/latest/sql-data-sources-avro.html) via the `-P` option to
 control how Avro content is written. These options are expressed as `-PoptionName=optionValue`.
 
+For configuration options listed in the above Spark Avro guide, use the `-C` option instead. For example,
+`-Cspark.sql.avro.compression.codec=deflate` would change the type of compression used for writing Avro files.
+
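+As a sketch of how the two kinds of options combine in one command (the command prefix, connection string, and
+query below are placeholder values, not taken from this guide; `recordName` is one of the Spark Avro data source
+options):
+
+```
+./bin/flux export-avro-files \
+    --connection-string "user:password@localhost:8000" \
+    --query "op.fromView('example', 'employees')" \
+    --path export \
+    -PrecordName=employee \
+    -Cspark.sql.avro.compression.codec=deflate
+```
+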
 ### Delimited text
 
 The `export-delimited-files` command writes one or more delimited text (commonly CSV) files to the directory
@@ -125,16 +128,22 @@ By default, each file will be written using the UTF-8 encoding. You can specify
 
 The `export-orc-files` command writes one or more ORC files to the directory specified by the `--path` option. This
 command reuses Spark's support for writing ORC files. You can include any of the
-[Spark ORC options](https://spark.apache.org/docs/latest/sql-data-sources-orc.html) via the `-P` option to
+[Spark ORC data source options](https://spark.apache.org/docs/latest/sql-data-sources-orc.html) via the `-P` option to
 control how ORC content is written. These options are expressed as `-PoptionName=optionValue`.
 
+For configuration options listed in the above Spark ORC guide, use the `-C` option instead. For example,
+`-Cspark.sql.orc.impl=hive` would change the ORC implementation used for writing files.
+
 ### Parquet
 
 The `export-parquet-files` command writes one or more Parquet files to the directory specified by the `--path` option. This
 command reuses Spark's support for writing Parquet files. You can include any of the
-[Spark Parquet options](https://spark.apache.org/docs/latest/sql-data-sources-parquet.html) via the `-P` option to
+[Spark Parquet data source options](https://spark.apache.org/docs/latest/sql-data-sources-parquet.html) via the `-P` option to
 control how Parquet content is written. These options are expressed as `-PoptionName=optionValue`.
 
+For configuration options listed in the above Spark Parquet guide, use the `-C` option instead. For example,
+`-Cspark.sql.parquet.compression.codec=gzip` would change the compression used for writing Parquet files.
+
 ## Controlling the save mode
 
 Each of the commands for exporting rows to files supports a `--mode` option that controls how data is written to a