@@ -74,13 +74,13 @@ You can link against this library in your program at the following coordinates:
 </tr>
 <tr>
 <td>
-<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.11<br>version: 2.6.11</pre>
+<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.11<br>version: 2.7.0</pre>
 </td>
 <td>
-<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.12<br>version: 2.6.11</pre>
+<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.12<br>version: 2.7.0</pre>
 </td>
 <td>
-<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.13<br>version: 2.6.11</pre>
+<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.13<br>version: 2.7.0</pre>
 </td>
 </tr>
 </table>
@@ -91,17 +91,17 @@ This package can be added to Spark using the `--packages` command line option. F
 
 ### Spark compiled with Scala 2.11
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.6.11
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.7.0
 ```
 
 ### Spark compiled with Scala 2.12
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.6.11
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.0
 ```
 
 ### Spark compiled with Scala 2.13
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.6.11
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.7.0
 ```
 
 ## Usage
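
As context for the `## Usage` section this hunk touches: once `spark-cobol` is on the classpath via `--packages`, a typical read goes through the `cobol` data source. A minimal sketch, assuming an active `SparkSession` and hypothetical copybook/data paths (not from this diff):

```scala
// Minimal Cobrix read sketch. The paths are placeholder examples;
// "cobol" is the short name of the spark-cobol data source and
// "copybook" points to the COBOL layout describing the records.
val df = spark.read
  .format("cobol")
  .option("copybook", "data/companies.cob")
  .load("data/companies_data")

df.printSchema()
df.show(truncate = false)
```

This requires a Spark runtime with the matching `spark-cobol` Scala-version artifact (2.11/2.12/2.13) from the table above.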
@@ -238,17 +238,17 @@ to decode various binary formats.
 
 The jars that you need to get are:
 
-* spark-cobol_2.12-2.6.11.jar
-* cobol-parser_2.12-2.6.11.jar
+* spark-cobol_2.12-2.7.0.jar
+* cobol-parser_2.12-2.7.0.jar
 * scodec-core_2.12-1.10.3.jar
 * scodec-bits_2.12-1.1.4.jar
 * antlr4-runtime-4.8.jar
 
 After that you can specify these jars in `spark-shell` command line. Here is an example:
 ```
-$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.6.11
+$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.0
 or
-$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.6.11.jar,cobol-parser_2.12-2.6.11.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar,antlr4-runtime-4.8.jar
+$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.7.0.jar,cobol-parser_2.12-2.7.0.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar,antlr4-runtime-4.8.jar
 
 Setting default log level to "WARN".
 To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
@@ -315,7 +315,7 @@ Creating an uber jar for Cobrix is very easy. Steps to build:
 
 You can collect the uber jar of `spark-cobol` either at
 `spark-cobol/target/scala-2.11/` or in `spark-cobol/target/scala-2.12/` depending on the Scala version you used.
-The fat jar will have '-bundle' suffix. You can also download pre-built bundles from https://github.com/AbsaOSS/cobrix/releases/tag/v2.6.11
+The fat jar will have '-bundle' suffix. You can also download pre-built bundles from https://github.com/AbsaOSS/cobrix/releases/tag/v2.7.0
 
 Then, run `spark-shell` or `spark-submit` adding the fat jar as the option.
 ```sh