Spark / SPARK-3140

PySpark start-up throws confusing exception


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.0.2
    • Fix Version/s: 1.1.0
    • Component/s: PySpark
    • Labels: None

    Description

      Currently we read the pyspark port through stdout of the spark-submit subprocess. However, if there is stdout interference, e.g. spark-submit echoes something unexpected to stdout, we print the following:

      Exception: Launching GatewayServer failed! (Warning: unexpected output detected.)
      

      That message is appropriate when there really is interfering output. However, we throw the same exception when the subprocess produces no output at all. This is very confusing because it implies the subprocess is outputting something (possibly whitespace, which is not visible) when it's actually outputting nothing.
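A minimal sketch of the distinction being asked for (this is illustrative, not Spark's actual launch code; the function name `parse_gateway_port` is hypothetical): when parsing the port from the subprocess's stdout, an empty read and a non-numeric read should raise different messages.

```python
def parse_gateway_port(stdout_line):
    """Parse the gateway port echoed by the subprocess on stdout.

    Raises a distinct error for "no output" versus "unexpected output",
    so the user isn't told there was interference when there was none.
    """
    if not stdout_line.strip():
        # Subprocess exited or closed stdout without printing a port.
        raise Exception("Launching GatewayServer failed! "
                        "(No output detected from the subprocess.)")
    try:
        return int(stdout_line.strip())
    except ValueError:
        # Something other than a bare port number appeared on stdout.
        raise Exception("Launching GatewayServer failed! "
                        "(Warning: unexpected output detected: %r)"
                        % stdout_line)
```

With this split, a silent subprocess produces a "no output" message instead of the misleading "unexpected output" warning.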


    People

      Assignee: Andrew Or (andrewor14)
      Reporter: Andrew Or (andrewor14)
      Votes: 0
      Watchers: 3

    Dates

      Created:
      Updated:
      Resolved: