Details
- Type: Sub-task
- Status: Closed
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 3.2.0
- Fix Version/s: None
- Component/s: None
Description
I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT build (Scala 2.12.4) on Java 17 and hit this exception:
java.lang.ExceptionInInitializerError
    at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
    at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
    at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
    at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
    ...
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @110df513
    at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
    at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
    at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
    at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
    at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
    at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
    at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
    at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
    at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
It seems that Java 17 is stricter about access to JDK internals: strong encapsulation is now enforced by default, so reflective access into non-exported packages such as java.nio fails unless the module is explicitly opened. See JEP 403: https://openjdk.java.net/jeps/403
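As a hedged workaround sketch (not an official fix from this ticket): reflective access of this kind can usually be re-enabled on Java 17 by passing `--add-opens` flags to the JVM. The exact set of modules Spark needs is an assumption here; the one named in the stack trace is `java.base/java.nio`, and `java.base/sun.nio.ch` is commonly needed alongside it for Spark's unsafe memory access.

```shell
# Open java.base internals to unnamed modules (where Spark's jars live).
# Module list is illustrative; extend it if other InaccessibleObjectException
# errors appear for different packages.
export JDK_JAVA_OPTIONS="--add-opens=java.base/java.nio=ALL-UNNAMED \
  --add-opens=java.base/sun.nio.ch=ALL-UNNAMED"

# Alternatively, pass the same flags per-process via spark-submit:
# (spark.driver.extraJavaOptions / spark.executor.extraJavaOptions are
# standard Spark configuration keys)
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
  my-pipeline.jar
```

Note that proper Java 17 support was tracked separately (see the duplicate link below); these flags only paper over the encapsulation check, they do not make Spark's use of JDK internals supported.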
Issue Links
- duplicates: SPARK-36796 "Make sql/core and dependent modules all UTs pass on Java 17" (Resolved)