Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Incomplete
- Affects Version/s: 2.3.1
- Fix Version/s: None
Description
Spark 2.3.1 has a Python 3 incompatibility when requesting a config value from SparkSession with a non-string default value. Reproduce via a SparkSession call:
spark.conf.get("myConf", False)
This gives the error:
>>> spark.conf.get("myConf", False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/josephkb/work/spark-bin/spark-2.3.1-bin-hadoop2.7/python/pyspark/sql/conf.py", line 51, in get
    self._checkType(default, "default")
  File "/Users/josephkb/work/spark-bin/spark-2.3.1-bin-hadoop2.7/python/pyspark/sql/conf.py", line 62, in _checkType
    if not isinstance(obj, str) and not isinstance(obj, unicode):
NameError: name 'unicode' is not defined
The offending check is in _checkType in branch-2.3: https://github.com/apache/spark/blob/branch-2.3/python/pyspark/sql/conf.py. It references the name unicode, which exists only in Python 2 and is not defined in Python 3.