Spark / SPARK-10066

Can't create HiveContext with spark-shell or spark-sql on snapshot


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 1.5.0
    • Fix Version/s: None
    • Component/s: Spark Shell, SQL
    • Labels: None
    • Environment: CentOS 6.6

    Description

      Built the 1.5.0-preview-20150812 with the following:

      ./make-distribution.sh -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Psparkr -DskipTests

      Starting spark-shell or spark-sql returns the following error:
      java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
      at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612) .... [elided]
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)

      It's failing while trying to create a new HiveContext. Running PySpark or SparkR works and creates a HiveContext successfully. An SQLContext can be created successfully from any shell.

      I've tried changing permissions on that HDFS directory (even as far as making it world-writable) without success. I also tried changing SPARK_USER and running spark-shell as different users, without success.

      This works successfully on the same machine with 1.4.1 and with earlier pre-release versions of Spark 1.5.0 (same make-distribution.sh params). Just trying the snapshot...
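
      The workaround commonly suggested for this class of error is to make the Hive scratch directory world-writable before starting the shell. This is a sketch under that assumption, not the resolution recorded on this issue (which was closed as Duplicate); note that although the message says "on HDFS", with a default local filesystem the directory being checked is the local /tmp/hive:

      ```shell
      # Make the Hive root scratch dir world-writable (the path /tmp/hive
      # is taken from the error message above).
      mkdir -p /tmp/hive
      chmod 777 /tmp/hive

      # If Spark is configured against a real HDFS, the equivalent commands
      # would be run through the Hadoop FileSystem shell:
      # hdfs dfs -mkdir -p /tmp/hive
      # hdfs dfs -chmod 777 /tmp/hive

      stat -c '%a' /tmp/hive   # expect 777
      ```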

            People

              Assignee: Unassigned
              Reporter: Robert Beauchemin (bobbeauch)
              Votes: 0
              Watchers: 6
