
Upgraded Arrow to 18 #2576

Merged — 7 commits merged from arrow-upgrade into master on Feb 17, 2025

Conversation

Contributor

@AbdulR3hman AbdulR3hman commented Feb 7, 2025

Issue #, if available:
Upgrading to Arrow 18; this will move us off a few CVEs as well as bring us closer to the latest version of Arrow. Requires further testing (release testing, etc.).

Description of changes:

Most notable changes are:

  • JdbcArrowTypeConverter now returns an Optional; this improves readability and is better than null checking
  • Excluded conflicting Arrow packages from google-big-query; version 15 was being pulled in instead of 18. This requires manual testing of google-big-query
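A minimal sketch of the Optional-returning converter pattern described in the first bullet. Names are simplified for illustration: the real JdbcArrowTypeConverter maps JDBC types to org.apache.arrow ArrowType instances, not strings, and the fallback behavior shown here is an assumption.

```java
import java.util.Optional;

public class OptionalConverterSketch {
    // Before: a null-returning converter forces every caller to null-check.
    static String toArrowTypeNullable(int jdbcType) {
        return jdbcType == java.sql.Types.VARCHAR ? "Utf8" : null;
    }

    // After: Optional makes the "no mapping" case explicit at the call site.
    static Optional<String> toArrowType(int jdbcType) {
        return Optional.ofNullable(toArrowTypeNullable(jdbcType));
    }

    public static void main(String[] args) {
        // Callers choose a fallback explicitly instead of guarding against null.
        System.out.println(toArrowType(java.sql.Types.VARCHAR).orElse("fallback")); // Utf8
        System.out.println(toArrowType(java.sql.Types.OTHER).orElse("fallback"));   // fallback
    }
}
```

The Optional return type moves the "unmapped type" decision to the caller, which is what the review comments below are reacting to in the connector metadata handlers.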

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
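The google-big-query exclusion mentioned in the second bullet could look roughly like the following. This is a hypothetical sketch: the exact artifactIds excluded in this PR are not shown in the conversation, so the Arrow artifact names below are assumptions based on common Arrow dependencies.

```xml
<!-- Hypothetical sketch: exclude the transitive Arrow 15 artifacts that
     google-cloud-bigquery pulls in, so the project-wide Arrow 18 wins. -->
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-bigquery</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.apache.arrow</groupId>
            <artifactId>arrow-vector</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.arrow</groupId>
            <artifactId>arrow-memory-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```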

@AbdulR3hman AbdulR3hman force-pushed the arrow-upgrade branch 3 times, most recently from 8feaed1 to cad94a7 on February 13, 2025 22:18
@awslabs awslabs deleted a comment from codecov bot Feb 14, 2025

codecov bot commented Feb 14, 2025

Codecov Report

Attention: Patch coverage is 52.54237% with 56 lines in your changes missing coverage. Please review.

Project coverage is 60.88%. Comparing base (f3521ad) to head (bff0d07).
Report is 49 commits behind head on master.

✅ All tests successful. No failed tests found.

Files with missing lines Patch % Lines
...ena/connectors/cloudera/ImpalaMetadataHandler.java 16.66% 8 Missing and 2 partials ⚠️
...naws/athena/connectors/db2/Db2MetadataHandler.java 25.00% 3 Missing and 3 partials ⚠️
...a/connectors/db2as400/Db2As400MetadataHandler.java 28.57% 3 Missing and 2 partials ⚠️
...thena/connectors/oracle/OracleMetadataHandler.java 50.00% 4 Missing and 1 partial ⚠️
...ena/connectors/saphana/SaphanaMetadataHandler.java 44.44% 3 Missing and 2 partials ⚠️
...tors/datalakegen2/DataLakeGen2MetadataHandler.java 42.85% 2 Missing and 2 partials ⚠️
...connectors/snowflake/SnowflakeMetadataHandler.java 50.00% 2 Missing and 2 partials ⚠️
...connectors/sqlserver/SqlServerMetadataHandler.java 42.85% 2 Missing and 2 partials ⚠️
...ena/connectors/synapse/SynapseMetadataHandler.java 42.85% 2 Missing and 2 partials ⚠️
...a/connectors/jdbc/manager/JdbcMetadataHandler.java 62.50% 0 Missing and 3 partials ⚠️
... and 3 more
Additional details and impacted files
@@             Coverage Diff              @@
##             master    #2576      +/-   ##
============================================
+ Coverage     60.68%   60.88%   +0.20%     
- Complexity     3871     3888      +17     
============================================
  Files           593      593              
  Lines         22130    22144      +14     
  Branches       2732     2738       +6     
============================================
+ Hits          13430    13483      +53     
+ Misses         7398     7349      -49     
- Partials       1302     1312      +10     


}
-        if (columnType != null && !SupportedTypes.isSupported(columnType)) {
+        if (columnType.isPresent() && !SupportedTypes.isSupported(columnType.get())) {
             columnType = Types.MinorType.VARCHAR.getType();

These 2 conditions can be combined
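As a hedged illustration of this suggestion, the presence check and the supported-type check can be folded into a single Optional chain instead of an isPresent()/get() pair. SupportedTypes is stubbed here and the names are illustrative, not the connector's actual code.

```java
import java.util.Optional;

public class CombineConditionsSketch {
    // Stub standing in for the connector's SupportedTypes helper.
    static boolean isSupported(String type) {
        return type.equals("Utf8") || type.equals("Int");
    }

    // Equivalent of:
    //   if (columnType.isPresent() && !isSupported(columnType.get())) { fall back to VARCHAR }
    // expressed as a single map over the Optional.
    static Optional<String> withVarcharFallback(Optional<String> columnType) {
        return columnType.map(t -> isSupported(t) ? t : "Utf8");
    }

    public static void main(String[] args) {
        System.out.println(withVarcharFallback(Optional.of("Decimal"))); // Optional[Utf8]
        System.out.println(withVarcharFallback(Optional.of("Int")));     // Optional[Int]
        System.out.println(withVarcharFallback(Optional.empty()));       // Optional.empty
    }
}
```

The map keeps the empty case untouched, so the separate presence test disappears from the call site.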

}

         LOGGER.debug("columnType: " + columnType);
-        if (columnType != null && SupportedTypes.isSupported(columnType)) {
+        if (columnType.isPresent() && SupportedTypes.isSupported(columnType.get())) {
             schemaBuilder.addField(FieldBuilder.newBuilder(columnName, columnType).build());

These 2 conditions can be combined

}

-        if (columnType != null && SupportedTypes.isSupported(columnType)) {
+        if (columnType.isPresent() && SupportedTypes.isSupported(columnType.get())) {
             LOGGER.debug("Adding column {} to schema of type {}", columnName, columnType);

isPresent is not really necessary here, as line 344 will handle it

@chngpe
Contributor

chngpe commented Feb 17, 2025

Discussed with the team; the refactoring part will come as a separate PR.

@chngpe chngpe merged commit 0c0a839 into master Feb 17, 2025
9 of 10 checks passed
@chngpe chngpe deleted the arrow-upgrade branch February 17, 2025 18:32
@chngpe chngpe restored the arrow-upgrade branch February 17, 2025 18:34
@chngpe chngpe deleted the arrow-upgrade branch February 17, 2025 18:56
github-actions bot pushed a commit that referenced this pull request Feb 17, 2025
  - Upgraded Arrow to 18 (#2576)
  - Fixed Encoding Warning (#2595)
  - Fix codecov format error (#2594)
  - Update Code Cov on Daily Validation Tests (#2593)
  - Allow custom glue endpoint to be used (#2587)
  - Msk connector was throwing null pointer exception when entire record is null, issue fix. (#2588)
  - Kafka connector was throwing null pointer exception when entire record is null, issue fix. (#2575)