Using -release option instead of -target can break compilation #12824

Closed
eejbyfeldt opened this issue Jul 11, 2023 · 4 comments

Comments

eejbyfeldt commented Jul 11, 2023

Reproduction steps

Scala version: 2.13.11
Java version: 17.0.7

$ cat import_sun.scala
import sun._
$ scalac -deprecation -target:8 import_sun.scala
warning: -target is deprecated: Use -release instead to compile against the correct platform API.
1 warning
$ scalac -release:8 import_sun.scala
import_sun.scala:1: error: not found: object sun
import sun._
       ^
1 error

Problem

Compiling with the -target flag gives a deprecation warning suggesting to use -release instead. But switching to -release means that the code no longer compiles.

If this is the intended behavior, maybe the deprecation warning/documentation should include more information about the differences between -target and -release.

lrytz commented Jul 11, 2023

Compiling on JDK 17 with -target:8 is not safe

➜ sandbox cat Test.scala
object Test {
  def main(args: Array[String]): Unit = {
    println(">".indent(2))
  }
}

➜ sandbox java17
➜ sandbox scalac -target:8 Test.scala
warning: 1 deprecation; re-run with -deprecation for details
1 warning
➜ sandbox scala Test
  >

➜ sandbox java8
➜ sandbox scala Test
Exception in thread "main" java.lang.NoSuchMethodError: java.lang.String.indent(I)Ljava/lang/String;
	at Test$.main(Test.scala:3)
	at Test.main(Test.scala)

Using -release instead fixes that

➜ sandbox java17
➜ sandbox scalac -release:8 Test.scala
Test.scala:3: error: value indent is not a member of String
did you mean indexOf or intern?
    println(">".indent(2))
                ^

However, the classpath element that -release uses in place of the JDK doesn't contain symbols for internal APIs (such as sun._), so you need to build on JDK 8 in this case.
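
If the code also has to run on JDK 8, the newer API needs to be replaced; a minimal sketch of a JDK 8-compatible stand-in (an approximation of String.indent, not an exact re-implementation):

object Compat {
  // Prefix every line of s with n spaces and terminate it with "\n",
  // roughly what the JDK 12+ String.indent does for a positive n.
  def indent(s: String, n: Int): String = {
    val pad = " " * n
    s.linesIterator.map(pad + _ + "\n").mkString
  }
}

With that, println(Compat.indent(">", 2)) compiles under -release:8.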

lrytz closed this as completed Jul 11, 2023
som-snytt commented Jul 11, 2023

Duplicates #12643

The comment at the end of that ticket mentions -javabootclasspath; as noted there and elsewhere:

To compile against a JDK 8 rt.jar, use -javabootclasspath rt.jar -nobootcp.
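
For illustration, with a local JDK 8 installation (the rt.jar path below is a placeholder), that invocation looks like:

$ scalac -javabootclasspath /path/to/jdk8/jre/lib/rt.jar -nobootcp import_sun.scala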

som-snytt reopened this Jul 11, 2023
som-snytt closed this as not planned Jul 11, 2023
eejbyfeldt (Author) commented

Thanks for clarifying.

Is it expected that, using -release 17, code that uses sun.* classes compiles but fails at runtime?

$ cat use_sun.scala
import sun.nio.ch.DirectBuffer

object Test {
  def main(args: Array[String]): Unit = {
    println(classOf[sun.nio.ch.DirectBuffer])
  }
}
$ scalac -release 17 use_sun.scala 
$ scala -release 17 use_sun.scala 
Exception in thread "main" java.lang.IllegalAccessError: class Main$ (in unnamed module @0xe260766) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0xe260766
	at Main$.main(use_sun.scala:5)
	at Main.main(use_sun.scala)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at scala.reflect.internal.util.RichClassLoader$.$anonfun$run$extension$1(ScalaClassLoader.scala:101)
	at scala.reflect.internal.util.RichClassLoader$.asContext$extension(ScalaClassLoader.scala:36)
	at scala.reflect.internal.util.RichClassLoader$.run$extension(ScalaClassLoader.scala:101)
	at scala.tools.nsc.CommonRunner.run(ObjectRunner.scala:30)
	at scala.tools.nsc.CommonRunner.run$(ObjectRunner.scala:28)
	at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:45)
	at scala.tools.nsc.CommonRunner.runAndCatch(ObjectRunner.scala:37)
	at scala.tools.nsc.CommonRunner.runAndCatch$(ObjectRunner.scala:36)
	at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:45)
	at scala.tools.nsc.AbstractScriptRunner.runCompiled(ScriptRunner.scala:168)
	at scala.tools.nsc.AbstractScriptRunner.$anonfun$runScript$1(ScriptRunner.scala:177)
	at scala.tools.nsc.AbstractScriptRunner.$anonfun$withCompiledScript$9(ScriptRunner.scala:154)
	at scala.tools.nsc.util.package$.trackingThreads(package.scala:55)
	at scala.tools.nsc.util.package$.waitingForThreads(package.scala:32)
	at scala.tools.nsc.AbstractScriptRunner.withCompiledScript(ScriptRunner.scala:151)
	at scala.tools.nsc.AbstractScriptRunner.runScript(ScriptRunner.scala:177)
	at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:75)
	at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:92)
	at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:105)
	at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:113)
	at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)

lrytz commented Jul 11, 2023

Yeah, the Scala compiler doesn't do access checks according to the Java module system (scala/scala-dev#529).

https://nipafx.dev/five-command-line-options-hack-java-module-system/ is a good article.
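
That article covers the relevant escape hatches; for example, the runtime error above can be avoided by explicitly exporting the package to the unnamed module (a sketch only, assuming the scala runner forwards -J options to the underlying JVM):

$ scala -J--add-exports=java.base/sun.nio.ch=ALL-UNNAMED -release 17 use_sun.scala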

dongjoon-hyun pushed a commit to apache/spark that referenced this issue Sep 13, 2023
### What changes were proposed in this pull request?
This PR downgrades `scala-maven-plugin` to version 4.7.1 to keep it from automatically adding the `-release` option as a Scala compilation argument.

### Why are the changes needed?
`scala-maven-plugin` versions 4.7.2 and later try to automatically append the `-release` option as a Scala compilation argument when it is not specified by the user:

1. 4.7.2 and 4.8.0: try to add the `-release` option for Scala versions 2.13.9 and higher.
2. 4.8.1: try to append the `-release` option for Scala versions 2.12.x/2.13.x/3.1.1, and append `-java-output-version` for Scala 3.1.2.

The addition of the `-release` option has caused the issues mentioned in SPARK-44376 | #41943 and #40442 (comment). This is because `-release` imposes stronger compilation restrictions than `-target`: it checks not only the bytecode format but also that the APIs used in the code exist in the specified Java version. However, many APIs in the `sun.*` packages are not exported in Java 11, 17, and 21, such as `sun.nio.ch.DirectBuffer`, `sun.util.calendar.ZoneInfo`, and `sun.nio.cs.StreamDecoder`, making them invisible when compiling across versions.

For discussions within the Scala community, see scala/bug#12643, scala/bug#12824, and scala/bug#12866; this is not considered a bug.

I have also submitted an issue to the `scala-maven-plugin` community to discuss the possibility of adding additional settings to control the addition of the `-release` option: davidB/scala-maven-plugin#722.

For Apache Spark 4.0, in the short term, I suggest downgrading `scala-maven-plugin` to version 4.7.1 to keep it from automatically adding the `-release` option as a Scala compilation argument. In the long term, we should reduce the use of APIs that are not exported, so the code stays compatible with the `-release` option, since `-target` has been deprecated since Scala 2.13.9.
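
As an illustration of that long-term direction (a sketch only, not code from this PR): moving the reference to the internal class behind reflection removes the compile-time dependency, so `-release` compilation succeeds and any access problem is deferred to run time.

```scala
import java.nio.ByteBuffer

// Sketch: look up the JDK-internal interface reflectively instead of importing
// sun.nio.ch.DirectBuffer, so the source compiles with -release.
object DirectBufferCheck {
  private val directBufferClass: Option[Class[_]] =
    try Some(Class.forName("sun.nio.ch.DirectBuffer"))
    catch { case _: ClassNotFoundException => None }

  // True if buf implements sun.nio.ch.DirectBuffer on the running JVM.
  def isDirectBuffer(buf: ByteBuffer): Boolean =
    directBufferClass.exists(_.isInstance(buf))
}
```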

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Pass GitHub Actions
- Manual check

Run `git revert 656bf36` to revert to Scala 2.13.11, then run `dev/change-scala-version.sh 2.13` to switch to the Scala 2.13 profile.

1. Run `build/mvn clean install -DskipTests -Pscala-2.13 -X` to check the Scala compilation arguments.

Before

```
[DEBUG] [zinc] Running cached compiler 1992eaf4 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with:
  -unchecked
  -deprecation
  -feature
  -explaintypes
  -target:jvm-1.8
  -Wconf:cat=deprecation:wv,any:e
  -Wunused:imports
  -Wconf:cat=scaladoc:wv
  -Wconf:cat=lint-multiarg-infix:wv
  -Wconf:cat=other-nullary-override:wv
  -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
  -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
  -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
  -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
  -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
  -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
  -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
  -Wconf:cat=unchecked&msg=outer reference:s
  -Wconf:cat=unchecked&msg=eliminated by erasure:s
  -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
  -Wconf:msg=Implicit definition should have explicit type:s
  -release
  8
  -bootclasspath
...
```

After

```
[DEBUG] [zinc] Running cached compiler 72dd4888 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with:
  -unchecked
  -deprecation
  -feature
  -explaintypes
  -target:jvm-1.8
  -Wconf:cat=deprecation:wv,any:e
  -Wunused:imports
  -Wconf:cat=scaladoc:wv
  -Wconf:cat=lint-multiarg-infix:wv
  -Wconf:cat=other-nullary-override:wv
  -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
  -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
  -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
  -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
  -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
  -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
  -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
  -Wconf:cat=unchecked&msg=outer reference:s
  -Wconf:cat=unchecked&msg=eliminated by erasure:s
  -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
  -Wconf:msg=Implicit definition should have explicit type:s
  -target:8
  -bootclasspath
  ...
```

After downgrading the version, the `-release` option should no longer appear in the compilation arguments.

2. Maven can build the project with Java 17 without the issue described in #41943, and after this PR we can re-upgrade Scala 2.13 to 2.13.11.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42899 from LuciferYang/SPARK-45144.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
szehon-ho pushed a commit to szehon-ho/spark that referenced this issue Aug 7, 2024