Build error on sun.* APIs when -release Java version is lower than the execution Java version #722
Friendly ping @slandelle for this issue. I don't know enough about the details of this project yet; do you think it's rational to add a toggle for the `-release` option?
Thank you, @LuciferYang.
dongjoon-hyun pushed a commit to apache/spark that referenced this issue on Sep 13, 2023:
### What changes were proposed in this pull request?

This PR downgrades `scala-maven-plugin` to version 4.7.1 to stop it from automatically adding the `-release` option as a Scala compilation argument.

### Why are the changes needed?

`scala-maven-plugin` versions 4.7.2 and later automatically append the `-release` option as a Scala compilation argument when the user does not specify it:

1. 4.7.2 and 4.8.0: add `-release` for Scala versions 2.13.9 and higher.
2. 4.8.1: append `-release` for Scala versions 2.12.x/2.13.x/3.1.1, and append `-java-output-version` for Scala 3.1.2.

The addition of the `-release` option caused the issues mentioned in SPARK-44376 | #41943 and #40442 (comment). This is because `-release` imposes stronger restrictions than `-target`: it checks not only the bytecode format but also that the APIs used in the code are available in the specified Java version. Many APIs in the `sun.*` packages, such as `sun.nio.ch.DirectBuffer`, `sun.util.calendar.ZoneInfo`, and `sun.nio.cs.StreamDecoder`, are not exported in Java 11, 17, and 21, making them invisible when compiling across Java versions. For discussion within the Scala community, see scala/bug#12643, scala/bug#12824, and scala/bug#12866; this behavior is not a bug. I have also submitted an issue to the `scala-maven-plugin` community to discuss adding a setting that controls whether `-release` is appended: davidB/scala-maven-plugin#722.

For Apache Spark 4.0, in the short term, I suggest downgrading `scala-maven-plugin` to version 4.7.1 so that it no longer automatically adds `-release`. In the long term, we should reduce our use of non-exported APIs for compatibility with `-release`, since `-target` has been deprecated as of Scala 2.13.9.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

- Pass GitHub Actions.
- Manual check: run `git revert 656bf36` to revert to Scala 2.13.11 and run `dev/change-scala-version.sh 2.13` to switch to Scala 2.13.

1. Run `build/mvn clean install -DskipTests -Pscala-2.13 -X` to check the Scala compilation arguments.

Before:

```
[DEBUG] [zinc] Running cached compiler 1992eaf4 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with: -unchecked -deprecation -feature -explaintypes -target:jvm-1.8 -Wconf:cat=deprecation:wv,any:e -Wunused:imports -Wconf:cat=scaladoc:wv -Wconf:cat=lint-multiarg-infix:wv -Wconf:cat=other-nullary-override:wv -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s -Wconf:msg=method without a parameter list overrides a method with a single empty one:s -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e -Wconf:cat=unchecked&msg=outer reference:s -Wconf:cat=unchecked&msg=eliminated by erasure:s -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s -Wconf:msg=Implicit definition should have explicit type:s -release 8 -bootclasspath ...
```

After:

```
[DEBUG] [zinc] Running cached compiler 72dd4888 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with: -unchecked -deprecation -feature -explaintypes -target:jvm-1.8 -Wconf:cat=deprecation:wv,any:e -Wunused:imports -Wconf:cat=scaladoc:wv -Wconf:cat=lint-multiarg-infix:wv -Wconf:cat=other-nullary-override:wv -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s -Wconf:msg=method without a parameter list overrides a method with a single empty one:s -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e -Wconf:cat=unchecked&msg=outer reference:s -Wconf:cat=unchecked&msg=eliminated by erasure:s -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s -Wconf:msg=Implicit definition should have explicit type:s -target:8 -bootclasspath ...
```

After downgrading, the `-release` option no longer appears in the compilation arguments.

2. Maven can build the project with Java 17 without the issue described in #41943, and after this PR we can re-upgrade Scala 2.13 to Scala 2.13.11.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #42899 from LuciferYang/SPARK-45144.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
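The short-term fix described above (staying on plugin 4.7.1) can be sketched as a `pom.xml` fragment. The plugin coordinates are the plugin's published Maven coordinates; the surrounding `build` section is illustrative, not Spark's actual build file:

```xml
<!-- Sketch: pin scala-maven-plugin to 4.7.1, the last version that does not
     auto-append -release to the Scala compiler arguments. -->
<build>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>4.7.1</version>
    </plugin>
  </plugins>
</build>
```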
szehon-ho pushed a commit to szehon-ho/spark that referenced this issue on Aug 7, 2024 (the commit message duplicates the apache/spark commit above).
mamccorm pushed a commit to wolfi-dev/os that referenced this issue on Jan 17, 2025:
**Context and implications of the changes:**

The inability to update `org.scala-lang:scala-library` to version `2.13.9` or higher in `spark-3.5.x` arises from a critical build issue documented in [SPARK-44376](https://issues.apache.org/jira/browse/SPARK-44376). The problem stems from the transition to Scala `2.13.11` ([SPARK-40497](https://issues.apache.org/jira/browse/SPARK-40497)) and the deprecation of the `-target` argument in favor of `-release` in the [scala-maven-plugin](davidB/scala-maven-plugin#722). This change introduces stricter compatibility checks, breaking builds on Java 11 or later. The key errors involve inaccessible `sun.*` classes such as `DirectBuffer` and `Unsafe`, which are not exported in Java versions above 8 ([source](scala/bug#12643)). Despite manual attempts to adjust the Maven configuration, such as switching `-target:jvm-1.8` to `-release:8`, compilation failures persist because of the restrictions imposed by the `-release` argument. The issue is compounded by the plugin automatically adding `-release` for Scala `2.13.9` and above, leading to incompatibilities when targeting Java 8 compatibility while building on newer Java versions.

While upstream remediated this by upgrading to `2.13.11`, even `2.13.9` is not possible with `scala-maven-plugin` version `4.8.0`. Currently, `scala.version` is hardcoded to `2.13.8`, and fixing this requires the following PRs:

1. https://github.com/apache/spark/pull/41626/files
2. https://github.com/apache/spark/pull/42899/files

**The only reason remediation is achievable is due to the following conditions:**

1. It is already merged upstream and awaiting a `4.0.0` release.
2. The commits are not intertwined with other, more complicated initiatives or functional changes.
3. Support for the Java 8 runtime dependency is not dropped by this implementation.
4. This is a critical CVE.
5. Thorough package- and image-level testing is in place.
6. Downgrading `scala-maven-plugin` to `4.7.1` does not introduce any new CVEs beyond those already existing in `4.8.0`.

**This was a more involved remediation due to:**

1. The dynamic setting of this version property via [change-scala-version.sh](https://github.com/apache/spark/blob/3b0ac45391708642ce6a1779e3c234bab0e40b66/dev/change-scala-version.sh#L72C1-L81C21).
2. The malformed `scala.version` property, which does not follow the usual `pom.xml` conventions: the same property value is defined twice in `pom.xml`, which prevents maven/pombump from functioning.
3. The careful and diligent investigation required to gain confidence in the fix, alongside the experience gained from working on previous spark-3.5 issues.

Bonus: an additional remediation for ivy/GHSA-2jc4-r94c-rp7h was tacked on.

---------

Signed-off-by: jamie-albert <[email protected]>
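The kind of property rewriting that `change-scala-version.sh`-style scripts perform can be sketched as follows. The sample file, its path, and the property layout are illustrative assumptions, not Spark's actual `pom.xml` or script:

```shell
# Sketch: rewrite a <scala.version> property in a sample pom fragment with sed,
# the way a change-scala-version-style script might (GNU sed; BSD sed needs -i '').
cat > /tmp/pom-sample.xml <<'EOF'
<properties>
  <scala.version>2.12.18</scala.version>
  <scala.binary.version>2.12</scala.binary.version>
</properties>
EOF

TO_VERSION="2.13.8"
sed -i "s|<scala.version>[0-9.]*</scala.version>|<scala.version>${TO_VERSION}</scala.version>|" /tmp/pom-sample.xml
grep '<scala.version>' /tmp/pom-sample.xml
```

A property defined twice (as the comment above describes) would be rewritten in both places by this approach, which is one reason single-purpose scripts are used instead of generic property-bumping tools.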
Here is the simplest reproducible example: https://github.com/LuciferYang/scala-maven-plugin-test

The reproducible code is very simple. Run

```
mvn clean install -Dscala-maven-plugin.version=4.8.0
```

and the build fails with a compilation error. `mvn clean install -Dscala-maven-plugin.version=4.7.2` produces the same error, while `mvn clean install -Dscala-maven-plugin.version=4.7.1` builds successfully.

The `-release` option is automatically appended after #648. Maybe we need to add a new option that doesn't enforce appending `-release` when it's false?
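The compile-time vs. run-time visibility gap behind these failures can be illustrated with a small probe. `ReleaseProbe` is a hypothetical helper written for this sketch, not part of any project mentioned here; the point is that a class rejected by `-release 8` at compile time may still exist on the running JVM:

```scala
// Hypothetical probe: sun.* classes that scalac rejects under `-release`
// can often still be loaded reflectively at run time on the same JVM.
object ReleaseProbe {
  def isLoadable(name: String): Boolean =
    try { Class.forName(name); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // A direct reference like `val b: sun.nio.ch.DirectBuffer = ...` would fail
    // to compile with `-release 8` on JDK 11+; the reflective lookup below
    // sidesteps that compile-time check entirely.
    println(s"DirectBuffer loadable: ${isLoadable("sun.nio.ch.DirectBuffer")}")
  }
}
```

This is essentially why the issue is a build-time problem rather than a run-time one: the bytecode would work, but `-release` refuses to compile against non-exported APIs.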