Spark / SPARK-9884

Submission scripts are broken on EC2 clusters started with spark-ec2 with private IPs


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Won't Fix
    • Affects Version/s: 1.4.1
    • Fix Version/s: None
    • Component/s: EC2
    • Labels: None

    Description

      I've started some fresh clusters using spark-ec2 on VPCs that only have private IP addresses. This worked fine with some PRs I made to spark-ec2 against 1.3 that are now in 1.4. However, now all job submission is broken.

      ./bin/pyspark and ./bin/sparkR work fine. However, ./bin/spark-submit, ./bin/spark-class, ./bin/spark-shell, and ./bin/spark-sql are all broken. They return immediately to the command line with no error and no output.

      I've tracked it down to some environment stuff in this diff:

      https://github.com/amplab/spark-ec2/commit/b21512ef553e382b1bdde48fcb0c7b23a5187616

      I don't entirely understand the root cause (it's related to nested sourcing of bash scripts and environment variables), but I have a fix that I'll submit shortly.
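      A minimal sketch of one way nested sourcing can produce exactly this symptom (a launcher returning to the prompt with no error and no output). This is not the actual spark-ec2 code; the guard variable and file name below are hypothetical stand-ins. The key bash behavior is that `exit` inside a *sourced* file terminates the sourcing shell itself, not just the sourced file:

```shell
#!/usr/bin/env bash
# Hypothetical illustration (not the real spark-ec2 scripts): a sourced
# "env" script with a load guard that calls `exit` instead of `return`.
cat > /tmp/fake-spark-env.sh <<'EOF'
if [ -n "$SPARK_ENV_LOADED" ]; then
  exit 0   # in a sourced file, this exits the CALLING shell, silently
fi
export SPARK_ENV_LOADED=1
EOF

# Run the demo in a subshell so the `exit` doesn't kill this script.
out=$(
  export SPARK_ENV_LOADED=1   # simulate the env having been loaded already
  . /tmp/fake-spark-env.sh    # the guard's `exit 0` fires here
  echo "still running"        # never reached: the subshell already exited
)

echo "captured output: [$out]"   # empty -- the caller died with status 0
```

      If this is the failure mode, the usual fix is to use `return` rather than `exit` in any script that is meant to be sourced, since `return` hands control back to the caller while `exit` tears down the whole shell.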


          People

            Assignee: Unassigned
            Reporter: Michelangelo D'Agostino (mdagost)
            Votes: 0
            Watchers: 1
