Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version(s): 3.0.0, 3.0.1
Description
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark: SparkSession = SparkSession
  .builder()
  .master("local")
  .appName("SparkByExamples.com")
  .getOrCreate()

spark.sparkContext.setLogLevel("ERROR")
import spark.sqlContext.implicits._

val df = Seq((-0.0, 0.0)).toDF("neg", "pos")
  .withColumn("comp", col("neg") < col("pos"))
df.show(false)

======
+----+---+----+
|neg |pos|comp|
+----+---+----+
|-0.0|0.0|true|
+----+---+----+
I think that result should be false: IEEE 754 treats -0.0 and 0.0 as equal, so "neg < pos" should not hold.
*Apache Spark 2.4.6 RESULT*
scala> spark.version
res0: String = 2.4.6

scala> Seq((-0.0, 0.0)).toDF("neg", "pos").withColumn("comp", col("neg") < col("pos")).show
+----+---+-----+
| neg|pos| comp|
+----+---+-----+
|-0.0|0.0|false|
+----+---+-----+
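The regression matches the distinction between the two orderings a JVM offers for doubles. A minimal Java sketch (Java rather than Scala so it is self-contained; `java.lang.Double.compare` is the total ordering that Scala's `Ordering.Double.TotalOrdering`, named in the linked SPARK-30009, is based on):

```java
public class SignedZeroOrdering {
    public static void main(String[] args) {
        // IEEE 754 comparison operators: -0.0 and 0.0 compare as equal.
        System.out.println(-0.0 < 0.0);   // false (the Spark 2.4.6 behavior)
        System.out.println(-0.0 == 0.0);  // true

        // Total ordering: Double.compare places -0.0 strictly before 0.0,
        // which is consistent with the `true` result seen in Spark 3.0.0.
        System.out.println(Double.compare(-0.0, 0.0) < 0);  // true
    }
}
```

If comparisons go through the total ordering instead of the IEEE operators, `-0.0 < 0.0` flips from false to true, which is exactly the change reported above.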
Attachments
Issue Links
- is caused by SPARK-30009 Replace Ordering.Double with Ordering.Double.TotalOrdering (Resolved)
- relates to HIVE-11174 Hive does not treat floating point signed zeros as equal (-0.0 should equal 0.0 according to IEEE floating point spec) (Closed)