Affects Version/s: 2.4.3
Fix Version/s: None
Component/s: Spark Core
master = local[*]
This is a suggestion to reduce (what I think is) some code redundancy.
Looking at this block of code in org.apache.spark.Partitioner's defaultPartitioner method:
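For reference, the block in question looks roughly like this in the 2.4.x source (quoted from memory, so exact line numbers and formatting may differ):

    if (hasMaxPartitioner.nonEmpty && (isEligiblePartitioner(hasMaxPartitioner.get, rdds) ||
        defaultNumPartitions <= hasMaxPartitioner.get.getNumPartitions)) {
      hasMaxPartitioner.get.partitioner.get
    } else {
      new HashPartitioner(defaultNumPartitions)
    }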
The first part of the && in the if condition is true if hasMaxPartitioner is non-empty, which means that a scan of rdds found at least one RDD with a partitioner whose number of partitions is > 0; hasMaxPartitioner is then the Option-wrapped RDD whose partitioner has the greatest number of partitions.
We then pass the RDD inside hasMaxPartitioner to isEligiblePartitioner, which sets maxPartitions to the maximum number of partitions across rdds and then checks whether

log10(maxPartitions) - log10(hasMaxPartitioner.getNumPartitions) < 1
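For reference, isEligiblePartitioner in the 2.4.x source is roughly the following (again quoted from memory):

    private def isEligiblePartitioner(
        hasMaxPartitioner: RDD[_],
        rdds: Seq[RDD[_]]): Boolean = {
      val maxPartitions = rdds.map(_.partitions.length).max
      log10(maxPartitions) - log10(hasMaxPartitioner.getNumPartitions) < 1
    }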
It seems to me that the two values passed to log10 here will always be equal, so subtracting them will always give 0, which is always < 1.
So... isn't this whole block of code redundant?
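As a quick standalone sanity check of that arithmetic (a minimal sketch, not Spark code; isEligible is a hypothetical stand-in for the body of isEligiblePartitioner):

    import scala.math.log10

    // Stand-in for the predicate, with both partition counts supplied directly.
    def isEligible(maxPartitions: Int, numPartitions: Int): Boolean =
      log10(maxPartitions) - log10(numPartitions) < 1

    // If both arguments are the same value n, the difference is always 0, which is < 1.
    Seq(1, 10, 1000, 1000000).foreach { n =>
      println(s"n=$n -> ${isEligible(n, n)}") // prints true for every n
    }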
It might even be a bug: if isEligiblePartitioner always returns true, then the right-hand side of the && is always true, the || short-circuits, and we never actually check that

defaultNumPartitions <= hasMaxPartitioner.get.getNumPartitions
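To make the short-circuit concrete, a minimal sketch (alwaysTrue and neverEvaluated are made-up names for illustration):

    // If the left operand of || is always true, the right operand is never evaluated.
    def alwaysTrue: Boolean = true
    def neverEvaluated: Boolean = { println("checking defaultNumPartitions"); false }

    println(alwaysTrue || neverEvaluated) // prints "true"; neverEvaluated's println never runs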