Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version/s: 2.4.3
- Fix Version/s: None
- Environment: Darwin <hostname.local> 18.7.0 Darwin Kernel Version 18.7.0: Thu Jun 20 PDT 2019; root:xnu-4903.270.47~4/RELEASE_X86_64 x86_64
Description
When creating a Spark context with the master running on the local host, I get a connection refused error; using the actual host name instead, for example "spark://myhostname.local:7077", works.
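The mismatch can be seen without Spark at all: on a multi-homed machine, "localhost" and the machine's own host name typically resolve to different addresses, so a master bound to one refuses connections aimed at the other. A sketch using plain JDK lookups (the addresses in the comments are illustrative, not from the report):

```scala
import java.net.InetAddress

// "localhost" resolves on the loopback interface, while the machine's
// own host name usually resolves to an external interface on a
// multi-homed Mac -- a master bound to one address refuses
// connections addressed to the other.
val loopback = InetAddress.getByName("localhost")
val external = InetAddress.getLocalHost

println(s"localhost -> ${loopback.getHostAddress}")               // e.g. 127.0.0.1
println(s"${external.getHostName} -> ${external.getHostAddress}") // e.g. 192.168.x.x
```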
performance-meter {
  spark {
    appname = "test-harness"
    master = "spark://localhost:7077"
  }
}
import com.typesafe.config.ConfigFactory
import org.apache.spark.sql.SparkSession

val conf = ConfigFactory.load() // picks up the performance-meter block above

val configRoot = "performance-meter"
val sparkSession = SparkSession.builder
  .appName(conf.getString(s"${configRoot}.spark.appname"))
  .master(conf.getString(s"${configRoot}.spark.master"))
  .getOrCreate()
This appears to be because some Macs have multiple network interfaces; that is at least the case on my Mac. A recommended fix that seems to work locally:
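The multiple interfaces are easy to confirm from the JVM itself (a sketch; the interface names in the comment are typical macOS ones, not taken from the report):

```scala
import java.net.NetworkInterface

// Enumerate every interface the JVM can see, with its addresses.
// On a multi-homed Mac this typically lists lo0 (127.0.0.1) alongside
// en0/en1 external addresses; `hostname -f` resolves to one of the
// external addresses, which is what the stock start scripts bind to.
val ifaces = NetworkInterface.getNetworkInterfaces
while (ifaces.hasMoreElements) {
  val iface = ifaces.nextElement()
  val addrs = iface.getInetAddresses
  while (addrs.hasMoreElements) {
    println(f"${iface.getName}%-6s ${addrs.nextElement().getHostAddress}")
  }
}
```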
In the files /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-master.sh and /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-slaves.sh, add a case section for "Darwin":
if [ "$SPARK_MASTER_HOST" = "" ]; then
  case `uname` in
    (SunOS)
      SPARK_MASTER_HOST="`/usr/sbin/check-hostname | awk '{print $NF}'`"
      ;;
    (Darwin)
      # 13-Sep-2019 alexshagiev add Mac (Darwin) case to ensure spark binds
      # on local host interface instead of external interface.
      SPARK_MASTER_HOST="localhost"
      ;;
    (*)
      SPARK_MASTER_HOST="`hostname -f`"
      ;;
  esac
fi
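Since the scripts only compute a default when $SPARK_MASTER_HOST is empty, an alternative that avoids patching the Homebrew-managed files is to set the variable in conf/spark-env.sh, which the start scripts source before this check. This is a sketch; the path assumes the same Homebrew layout as above:

```shell
# /usr/local/Cellar/apache-spark/2.4.3/libexec/conf/spark-env.sh
# Bind the master to the loopback interface explicitly so the start
# scripts never fall back to `hostname -f`.
SPARK_MASTER_HOST=localhost
```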