Description
In Spark SQL, the method "canUpCast" returns true iff we can safely up-cast the `from` type to the `to` type without any truncation, precision loss, or possible runtime failures.
However, DecimalType(10, 0) is currently considered "canUpCast" to the Integer type. This is wrong: a decimal(10, 0) value such as 9000000000BD exceeds Integer's range, so casting it to Integer overflows.
As a result:
- The optimizer rule SimplifyCasts relies on the method "canUpCast" and will mistakenly simplify "cast(cast(9000000000BD as int) as long)" to "cast(9000000000BD as long)", changing the query result.
- The STRICT store assignment policy relies on this method too. With the policy enabled, inserting 9000000000BD into an integer column passes the compile-time check and inserts the unexpected value 410065408 (9000000000 truncated to its low 32 bits).
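The wraparound value above can be reproduced outside Spark with plain Java, since narrowing a `BigDecimal` to `int` keeps only the low 32 bits of the value, just like the overflowing cast described in this issue (a minimal sketch; Spark's internal Decimal-to-int path is assumed to narrow the same way):

```java
import java.math.BigDecimal;

public class DecimalOverflowDemo {
    public static void main(String[] args) {
        BigDecimal d = new BigDecimal("9000000000");

        // intValue() silently discards the high bits:
        // 9000000000 mod 2^32 = 410065408
        System.out.println(d.intValue());   // prints 410065408

        // longValue() is safe here, since 9000000000 fits in a long
        System.out.println(d.longValue());  // prints 9000000000

        // intValueExact() would throw ArithmeticException instead of
        // overflowing, which is the behavior a safe up-cast check wants.
    }
}
```

This is why up-casting decimal(10, 0) to Integer cannot be considered safe: a 10-digit decimal can hold values far beyond Integer.MAX_VALUE (2147483647), even though decimal(10, 0) to Long is fine.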