HBASE-27564

Add default encryption type for MiniKDC to fix failed tests on JDK11+


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.6.0, 2.4.16, 2.5.3
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      An example of a failed test run with Hadoop 2 and JDK 17:

      [INFO] Running org.apache.hadoop.hbase.coprocessor.TestSecureExport
      [ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 56.87 s <<< FAILURE! - in org.apache.hadoop.hbase.coprocessor.TestSecureExport
      [ERROR] org.apache.hadoop.hbase.coprocessor.TestSecureExport  Time elapsed: 56.862 s  <<< ERROR!
      java.io.IOException: Failed on local exception: java.io.IOException: Couldn't setup connection for tianhang.tang/localhost@EXAMPLE.COM to localhost/127.0.0.1:53756; Host Details : local host is: "Tangs-MacBook-Pro.local/10.2.175.4"; destination host is: "localhost":53756;
          at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:805)
          at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1544)
          at org.apache.hadoop.ipc.Client.call(Client.java:1486)
          at org.apache.hadoop.ipc.Client.call(Client.java:1385)
          at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
          at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
          at jdk.proxy2/jdk.proxy2.$Proxy34.getDatanodeReport(Unknown Source)
          at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDatanodeReport(ClientNamenodeProtocolTranslatorPB.java:653)
          at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
          at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.base/java.lang.reflect.Method.invoke(Method.java:568)
          at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
          at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
          at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
          at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
          at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
          at jdk.proxy2/jdk.proxy2.$Proxy35.getDatanodeReport(Unknown Source)
          at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2111)
          at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2698)
          at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2742)
          at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1723)
          at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:905)
          at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:798)
          at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:668)
          at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:641)
          at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1130)
          at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1105)
          at org.apache.hadoop.hbase.coprocessor.TestSecureExport.beforeClass(TestSecureExport.java:206)
          at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
          at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.base/java.lang.reflect.Method.invoke(Method.java:568)
          at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
          at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
          at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
          at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
          at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
          at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
          at org.apache.hadoop.hbase.SystemExitRule$1.evaluate(SystemExitRule.java:38)
          at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)
          at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.lang.Thread.run(Thread.java:833)
      Caused by: java.io.IOException: Couldn't setup connection for tianhang.tang/localhost@EXAMPLE.COM to localhost/127.0.0.1:53756
          at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:763)
          at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
          at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
          at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:734)
          at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:828)
          at org.apache.hadoop.ipc.Client$Connection.access$3700(Client.java:423)
          at org.apache.hadoop.ipc.Client.getConnection(Client.java:1601)
          at org.apache.hadoop.ipc.Client.call(Client.java:1432)
          ... 41 more
      Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Message stream modified (41) - Message stream modified)]
          at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:228)
          at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:407)
          at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:629)
          at org.apache.hadoop.ipc.Client$Connection.access$2200(Client.java:423)
          at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:815)
          at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:811)
          at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
          at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
          at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:810)
          ... 44 more
      Caused by: GSSException: No valid credentials provided (Mechanism level: Message stream modified (41) - Message stream modified)
          at java.security.jgss/sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:778)
          at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:266)
          at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:196)
          at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:209)
          ... 53 more
      Caused by: KrbException: Message stream modified (41) - Message stream modified
          at java.security.jgss/sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:72)
          at java.security.jgss/sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:224)
          at java.security.jgss/sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:235)
          at java.security.jgss/sun.security.krb5.internal.CredentialsUtil.serviceCredsSingle(CredentialsUtil.java:477)
          at java.security.jgss/sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:340)
          at java.security.jgss/sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:314)
          at java.security.jgss/sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:169)
          at java.security.jgss/sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:493)
          at java.security.jgss/sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:700)
          ... 56 more
      Caused by: KrbException: Identifier doesn't match expected value (906)
          at java.security.jgss/sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
          at java.security.jgss/sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
          at java.security.jgss/sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
          at java.security.jgss/sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:54)
          ... 64 more 

      That's because hadoop-minikdc versions lower than 3.0 have compatibility issues with JDK11+; there is useful background in KAFKA-7338, FLINK-13516 and SPARK-29957: Java 11 added the new Kerberos 5 encryption types aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 and enables them by default, and the older MiniKDC cannot handle them, which is what produces the "Message stream modified (41)" error above.
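
      As a rough sketch of the workaround those tickets point to (this is not the patch for this issue, and the class, method and variable names below are hypothetical), a test can point the JVM at a krb5.conf that pins the client to an encryption type the old MiniKDC understands, so the new JDK11+ defaults are never offered during the TGS exchange:

      import java.io.IOException;
      import java.nio.charset.StandardCharsets;
      import java.nio.file.Files;
      import java.nio.file.Path;

      /**
       * Hypothetical helper (not HBase code): writes a krb5.conf that restricts the
       * Kerberos client to aes128-cts-hmac-sha1-96 and tells the JVM to use it.
       * It has to run before any JGSS/Kerberos class reads the configuration.
       */
      public final class Krb5EnctypePin {

        private Krb5EnctypePin() {
        }

        public static Path writePinnedKrb5Conf(Path dir, String realm, String kdcHostPort)
            throws IOException {
          String conf =
              "[libdefaults]\n"
                  + "  default_realm = " + realm + "\n"
                  + "  udp_preference_limit = 1\n"
                  // Offer only an enctype the pre-3.0, ApacheDS-based MiniKDC handles.
                  + "  default_tkt_enctypes = aes128-cts-hmac-sha1-96\n"
                  + "  default_tgs_enctypes = aes128-cts-hmac-sha1-96\n"
                  + "  permitted_enctypes = aes128-cts-hmac-sha1-96\n"
                  + "\n"
                  + "[realms]\n"
                  + "  " + realm + " = {\n"
                  + "    kdc = " + kdcHostPort + "\n"
                  + "  }\n";
          Path krb5 = dir.resolve("krb5.conf");
          Files.write(krb5, conf.getBytes(StandardCharsets.UTF_8));
          // Standard JDK system property naming the Kerberos configuration file.
          System.setProperty("java.security.krb5.conf", krb5.toAbsolutePath().toString());
          return krb5;
        }
      }

      With the client limited to aes128-cts-hmac-sha1-96 (e.g. calling writePinnedKrb5Conf(testDir, "EXAMPLE.COM", "localhost:" + kdcPort) before starting the mini cluster), the failing TGS exchange above does not occur; per the title, the fix here instead gives MiniKDC itself an explicit default encryption type.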

      Actually I'm not sure whether it is suitable to merge this into master, because HBase has a rule that JDK11+ may only run with Hadoop3+. Is this just a design rule, or is it caused by some compatibility issue? If it is not a hard rule, maybe we can try to find out the issues and fix them. I wish someone could give me some background info.

            People

              Assignee: tianhang tang (tangtianhang)
              Reporter: tianhang tang (tangtianhang)
              Votes: 0
              Watchers: 3
