Hadoop Common / HADOOP-18411

Unable to Find LoginModule Class using IBM Java OpenJ9 version 8.0.332.0


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.3.4
    • Fix Version/s: None
    • Component/s: security
    • Labels: None
    • Environment: development

    Description

      Hi,

      I am using Spark v.3.3.0 and Java version IBM Semeru 8.0.332.0.

      When I run my Spark Job I get the following exception:

      org.apache.hadoop.security.KerberosAuthException: failure to login: javax.security.auth.login.LoginException: unable to find LoginModule class: com.ibm.security.auth.module.JAASLoginModule
          at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1986)
          at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:719)
          at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
          at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:579)
          at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
          at scala.Option.getOrElse(Option.scala:138)
          at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
          at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
          at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
          at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
          at scala.Option.getOrElse(Option.scala:138)

      This looks similar to a previously reported error that has been fixed: https://issues.apache.org/jira/browse/HADOOP-17971

      N.B. The exception I am getting does not contain "org.apache.hadoop.shaded" in the package name, whereas in HADOOP-17971 it does (org.apache.hadoop.shaded.com.ibm.security.auth.module.JAASLoginModule).
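
      For anyone trying to reproduce this, here is a minimal standalone probe (my own diagnostic sketch, not Hadoop or Spark code) that reports which JAAS LoginModule classes the running JVM can resolve. The IBM class name is the one from the stack trace above; the com.sun one is the OpenJDK equivalent, which is what I would expect to be present on Semeru, since (as far as I understand) Semeru pairs the OpenJ9 VM with OpenJDK class libraries:

```java
// LoginModuleProbe: checks whether the JVM can resolve each candidate
// JAAS LoginModule class by name, the same lookup that fails in the
// stack trace above. Purely a diagnostic sketch.
public class LoginModuleProbe {
    public static void main(String[] args) {
        String[] candidates = {
            "com.ibm.security.auth.module.JAASLoginModule",  // IBM J9 class libraries
            "com.sun.security.auth.module.UnixLoginModule"   // OpenJDK class libraries
        };
        for (String name : candidates) {
            try {
                Class.forName(name);
                System.out.println("FOUND     " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("NOT FOUND " + name);
            }
        }
    }
}
```

      On my 8.0.332.0 Semeru runtime this kind of probe would show whether the IBM-specific class is actually absent from the classpath, which is what the LoginException suggests.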

       

      The Spark spark-core_2.12 library pulls in two Hadoop dependencies:

      hadoop-client-api:jar:3.3.2:compile

      hadoop-client-runtime:jar:3.3.2:compile

      After getting the exception, I tried excluding those artifacts from the Spark dependency in my pom.xml and defining them explicitly as direct dependencies. I tried versions 3.3.4 and 3.3.3, but I still get the same error.
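
      The pom.xml change I describe above looks roughly like this (a sketch; the groupIds are the standard org.apache.spark / org.apache.hadoop coordinates, and the versions match the ones mentioned in this report):

```xml
<!-- Exclude the Hadoop client jars that spark-core pulls in transitively,
     then declare them explicitly at the version under test. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.3.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-runtime</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-api</artifactId>
  <version>3.3.4</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-runtime</artifactId>
  <version>3.3.4</version>
</dependency>
```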

       

      N.B. I don't get this exception with Java version IBM Semeru 8.0.312.0

      I can move this to a Spark issue if this isn't the correct place to post it.

      Thanks,

      Steve

       

          People

            Assignee: Unassigned
            Reporter: Steve Chong (StevenC)
            Votes: 0
            Watchers: 2
