Details
- Type: Umbrella
- Status: Open
- Priority: Critical
- Resolution: Unresolved
- Fix Version: 4.0.0
- Labels: None
Description
This umbrella issue collects ideas for planning Apache Spark 4.0.0.
More items will be added here as they are excluded from Apache Spark 3.5.0 (feature freeze: July 16th, 2023).
- Spark 1: 2014.05 (1.0.0) ~ 2016.11 (1.6.3)
- Spark 2: 2016.07 (2.0.0) ~ 2021.05 (2.4.8)
- Spark 3: 2020.06 (3.0.0) ~ 2026.xx (3.5.x)
- Spark 4: 2024.06 (4.0.0, NEW)
Issue Links
- relates to
  - SPARK-44937 Add SSL/TLS support for RPC and Shuffle communications (Resolved)
  - SPARK-45923 Spark Kubernetes Operator (Open)
  - SPARK-43351 Support Golang in Spark Connect (Resolved)
  - SPARK-43831 Build and Run Spark on Java 21 (Resolved)
  - SPARK-44124 Upgrade AWS SDK to v2 (Open)
  - SPARK-35801 SPIP: Row-level operations in Data Source V2 (In Progress)
  - SPARK-42551 Support more subexpression elimination cases (In Progress)
  - SPARK-45869 Revisit and Improve Spark Standalone Cluster (Resolved)
  - SPARK-44183 Increase PyArrow minimum version to 4.0.0 (Resolved)
  - SPARK-47361 Improve JDBC data sources (Resolved)
  - SPARK-43836 Make Scala 2.13 as default Scala version in Spark 3.5 (Closed)
  - SPARK-47970 Revisit skipped parity tests for PySpark Connect (Open)
  - SPARK-37935 Migrate onto error classes (In Progress)
  - SPARK-48094 Reduce GitHub Action usage according to ASF project allowance (Reopened)
  - SPARK-45314 Drop Scala 2.12 and make Scala 2.13 by default (Resolved)
  - SPARK-45315 Drop JDK 8/11 and make JDK 17 by default (Resolved)
  - SPARK-47540 SPIP: Pure Python Package (Spark Connect) (Resolved)
  - SPARK-44101 Support pandas 2 (Resolved)
  - SPARK-44893 ThreadInfo improvements for monitoring APIs (Resolved)
  - SPARK-45981 Improve Python language test coverage (Resolved)
  - SPARK-47046 Apache Spark 4.0.0 Dependency Audit and Cleanup (Resolved)