Fix collection_ops_tests for Spark 4.0 [databricks] #11414
Conversation
Fixes NVIDIA#11011.

This commit fixes the failures in `collection_ops_tests` on Spark 4.0.

On all versions of Spark, when a Sequence is collected with rows that exceed MAX_INT, an exception is thrown indicating that the collected Sequence/array is larger than permissible. The versions of Spark differ in the contents of the exception message. On Spark 4, the error message contains more information than in all prior versions, including:

1. The name of the op causing the error
2. The errant sequence size

This commit introduces a shim to make this new information available in the exception. Note that this shim does not fit cleanly in RapidsErrorUtils, because there are differences within major Spark versions: Spark 3.4.0 and 3.4.1 produce a different message from 3.4.2 and 3.4.3, and likewise the message differs among 3.5.0, 3.5.1, and 3.5.2.

Signed-off-by: MithunR <[email protected]>
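For context, here is a minimal sketch of the shape such a version-gated shim could take. The trait name, object names, and message strings below are illustrative assumptions, not the plugin's actual sources:

```scala
// A minimal sketch (not the plugin's actual code) of a per-version
// shim for the sequence-size error message. Names and message text
// here are illustrative assumptions.
trait SequenceSizeErrorShim {
  def sequenceTooLongMessage(functionName: String, sequenceSize: Long): String
}

// Older Spark versions: the message names neither the op nor the size.
object PreSpark400Shim extends SequenceSizeErrorShim {
  override def sequenceTooLongMessage(functionName: String, sequenceSize: Long): String =
    "Unsuccessful try to create array with elements exceeding the array size limit"
}

// Spark 4.0: the message carries both the op name and the errant size.
object Spark400Shim extends SequenceSizeErrorShim {
  override def sequenceTooLongMessage(functionName: String, sequenceSize: Long): String =
    s"$functionName: sequence of $sequenceSize elements exceeds the maximum array size"
}
```

The point is only that each Spark version's shim supplies the wording that version actually produces, so the tests can assert against the right text.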
Force-pushed: 5606503 → b8bd960
Resolved review threads (outdated):
- sql-plugin/src/main/spark320/scala/com/nvidia/spark/rapids/shims/GetSequenceSize.scala
- sql-plugin/src/main/spark400/scala/org/apache/spark/sql/rapids/shims/SequenceSizeError.scala
This moves the construction of the long-sequence error strings into RapidsErrorUtils. Doing so required introducing several new RapidsErrorUtils classes, with the error-string construction supplied by mixing in concrete implementations.
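To illustrate the mix-in arrangement described above, here is a hypothetical sketch. SequenceSizeTooLongErrorBuilder is a real file name in this PR, but the method bodies, the second trait, and the object names are invented for illustration:

```scala
// A hypothetical sketch of the mix-in layout: a default error-string
// builder, an override for versions whose message changed, and one
// concrete RapidsErrorUtils object per shim directory.
trait SequenceSizeTooLongErrorBuilder {
  // Default wording (illustrative), shared by the older shims.
  def sequenceSizeTooLong(size: Long): String =
    s"Unsuccessful try to create array with $size elements due to exceeding the limit"
}

trait NewerSequenceSizeTooLongErrorBuilder extends SequenceSizeTooLongErrorBuilder {
  // Changed wording (also illustrative) for the later patch releases.
  override def sequenceSizeTooLong(size: Long): String =
    s"Can't create array of $size elements: larger than the supported maximum"
}

// Each spark*/ source directory contributes exactly one concrete object,
// so callers see a single RapidsErrorUtils name regardless of version.
object RapidsErrorUtils340 extends SequenceSizeTooLongErrorBuilder
object RapidsErrorUtils342 extends NewerSequenceSizeTooLongErrorBuilder
```

Because the build compiles only one shim directory per Spark profile, callers reference a single RapidsErrorUtils and pick up the right message wording for free.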
Apologies for the noise. I had to rebase this to target […]. @razajafri is already examining this change; the others can ignore this.
Title changed: Fix collection_ops_tests for Spark 4.0 → Fix collection_ops_tests for Spark 4.0 [databricks]
Resolved review threads (outdated):
- sql-plugin/src/main/spark330/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- sql-plugin/src/main/spark330db/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- sql-plugin/src/main/spark334/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- sql-plugin/src/main/spark340/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- sql-plugin/src/main/spark341db/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- sql-plugin/src/main/spark342/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- sql-plugin/src/main/spark400/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
I've addressed the code formatting suggestions.
Resolved review threads:
- sql-plugin/src/main/spark320/scala/com/nvidia/spark/rapids/shims/GetSequenceSize.scala
- sql-plugin/src/main/spark341db/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
- .../src/main/spark320/scala/com/nvidia/spark/rapids/shims/SequenceSizeTooLongErrorBuilder.scala
@razajafri: After addressing your whitespace concerns, I had started evaluating support for the 14.3 failure. I had a couple of markers to indicate changes, but I've closed them now. The behaviour with […] If this current state looks good to you for shipping, I'll merge this change and address the 14.3 failure separately.
Thanks for the review, @razajafri. I have merged this change.