Hive / HIVE-15256

One session's close deletes resourceDir, causing other sessions' open to fail


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Clients
    • Labels: None

    Description

      resourceDir is shared among clients. When one connected client closes its session, it deletes resourceDir; any other clients that are opening sessions at that moment then fail.

      The exception is below:

      Error opening session: | org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:536)
      java.lang.RuntimeException: ExitCodeException exitCode=1: chmod: cannot access `/opt/huawei/Bigdata/tmp/spark/dlresources': No such file or directory

      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:528)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:477)
      at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:229)
      at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:191)
      at org.apache.spark.sql.hive.client.ClientWrapper.newSession(ClientWrapper.scala:1053)
      at org.apache.spark.sql.hive.HiveContext.newSession(HiveContext.scala:93)
      at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:89)
      at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:189)
      at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:654)
      at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:522)
      at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
      at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
      at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
      at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
      at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:690)
      at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
      Caused by: ExitCodeException exitCode=1: chmod: cannot access `/opt/huawei/Bigdata/tmp/spark/dlresources': No such file or directory

      at org.apache.hadoop.util.Shell.runCommand(Shell.java:561)
      at org.apache.hadoop.util.Shell.run(Shell.java:472)
      at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
      at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
      at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
      at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:744)
      at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:502)
      at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:542)
      at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:520)
      at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:340)
      at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:656)
      at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:584)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
      ... 18 more
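The race above disappears if each session downloads resources into its own subdirectory, so that closing one session only removes that session's files. Below is a minimal sketch of that layout in Java; the class and method names (`SessionResourceDirs`, `open`, `close`) are hypothetical illustrations, not Hive's or Spark's actual code.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.UUID;

// Hypothetical sketch: per-session resource directories under a shared base,
// so closing one session cannot delete files another session still uses.
public class SessionResourceDirs {
    private final Path baseDir;

    public SessionResourceDirs(Path baseDir) {
        this.baseDir = baseDir;
    }

    // Each open() creates a unique directory for that session alone.
    public Path open() throws IOException {
        Path sessionDir = baseDir.resolve("session-" + UUID.randomUUID());
        Files.createDirectories(sessionDir);
        return sessionDir;
    }

    // close() removes only this session's directory, leaving siblings intact.
    public void close(Path sessionDir) throws IOException {
        try (var paths = Files.walk(sessionDir)) {
            paths.sorted(Comparator.reverseOrder())   // delete children before parents
                 .forEach(p -> p.toFile().delete());
        }
    }

    public static void main(String[] args) throws IOException {
        Path base = Files.createTempDirectory("dlresources");
        SessionResourceDirs mgr = new SessionResourceDirs(base);
        Path a = mgr.open();
        Path b = mgr.open();          // a second session is open concurrently
        mgr.close(a);                 // closing session A...
        if (!Files.isDirectory(b)) {  // ...must leave session B's directory intact
            throw new IllegalStateException("sibling session directory was deleted");
        }
        System.out.println("session B survived close of session A");
        mgr.close(b);
    }
}
```

With the shared-directory layout from the report, the equivalent of `close(a)` would remove the base directory itself, and session B's subsequent `mkdirs`/`chmod` would fail exactly as in the stack trace.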


      People

        Assignee: Unassigned
        Reporter: meiyoula
