Spark / SPARK-47133

java.lang.NullPointerException: Missing SslContextFactory when accessing Worker WebUI from Master as reverse proxy with SSL enabled


Details

    • Type: Question
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.5.0
    • Fix Version/s: None
    • Component/s: Web UI
    • Labels: None
    • Environment:
      • We are running Spark in stand-alone mode on Kubernetes.
      • The containers are based on Debian 11 (minideb).
      • The Spark version is 3.5.

    Description

      Hi,

       

      We are encountering the error described below.

      If SSL/TLS is enabled on both the Master and the Worker, it is not possible to access the WebUI of the latter through the former acting as reverse proxy. The returned error is the following.

      HTTP ERROR 500 java.lang.NullPointerException: Missing SslContextFactory
      
      URI:/proxy/worker-20240222171308-10.113.3.1-34959
      STATUS:500
      MESSAGE:java.lang.NullPointerException: Missing SslContextFactory
      SERVLET:org.apache.spark.ui.JettyUtils$$anon$3-7d068d54
      CAUSED BY:java.lang.NullPointerException: Missing SslContextFactory
      
      Caused by:java.lang.NullPointerException: Missing SslContextFactory
      	at java.base/java.util.Objects.requireNonNull(Objects.java:235)
      	at org.sparkproject.jetty.io.ssl.SslClientConnectionFactory.<init>(SslClientConnectionFactory.java:57)
      	at org.sparkproject.jetty.client.HttpClient.newSslClientConnectionFactory(HttpClient.java:1273)
      	at org.sparkproject.jetty.client.HttpClient.newSslClientConnectionFactory(HttpClient.java:1279)
      	at org.sparkproject.jetty.client.HttpDestination.newSslClientConnectionFactory(HttpDestination.java:209)
      	at org.sparkproject.jetty.client.HttpDestination.newSslClientConnectionFactory(HttpDestination.java:215)
      	at org.sparkproject.jetty.client.HttpDestination.<init>(HttpDestination.java:100)
      	at org.sparkproject.jetty.client.PoolingHttpDestination.<init>(PoolingHttpDestination.java:25)
      	at org.sparkproject.jetty.client.http.HttpDestinationOverHTTP.<init>(HttpDestinationOverHTTP.java:32)
      	at org.sparkproject.jetty.client.http.HttpClientTransportOverHTTP.newHttpDestination(HttpClientTransportOverHTTP.java:54)
      	at org.sparkproject.jetty.client.HttpClient.lambda$resolveDestination$0(HttpClient.java:597)
      	at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
      	at org.sparkproject.jetty.client.HttpClient.resolveDestination(HttpClient.java:593)
      	at org.sparkproject.jetty.client.HttpClient.resolveDestination(HttpClient.java:571)
      	at org.sparkproject.jetty.client.HttpClient.send(HttpClient.java:626)
      	at org.sparkproject.jetty.client.HttpRequest.sendAsync(HttpRequest.java:780)
      	at org.sparkproject.jetty.client.HttpRequest.send(HttpRequest.java:767)
      	at org.sparkproject.jetty.proxy.AbstractProxyServlet.sendProxyRequest(AbstractProxyServlet.java:618)
      	at org.sparkproject.jetty.proxy.ProxyServlet.service(ProxyServlet.java:114)
      	at javax.servlet.http.HttpServlet.service(HttpServlet.java:590)
      	at org.sparkproject.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
      	at org.sparkproject.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
      	at org.apache.spark.ui.HttpSecurityFilter.doFilter(HttpSecurityFilter.scala:95)
      	at org.sparkproject.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
      	at org.sparkproject.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
      	at org.sparkproject.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
      	at org.sparkproject.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
      	at org.sparkproject.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
      	at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
      	at org.sparkproject.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
      	at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
      	at org.sparkproject.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
      	at org.sparkproject.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
      	at org.sparkproject.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772)
      	at org.sparkproject.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
      	at org.sparkproject.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
      	at org.sparkproject.jetty.server.Server.handle(Server.java:516)
      	at org.sparkproject.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
      	at org.sparkproject.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
      	at org.sparkproject.jetty.server.HttpChannel.handle(HttpChannel.java:479)
      	at org.sparkproject.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
      	at org.sparkproject.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
      	at org.sparkproject.jetty.io.FillInterest.fillable(FillInterest.java:105)
      	at org.sparkproject.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:555)
      	at org.sparkproject.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:410)
      	at org.sparkproject.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:164)
      	at org.sparkproject.jetty.io.FillInterest.fillable(FillInterest.java:105)
      	at org.sparkproject.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
      	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
      	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
      	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
      	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
      	at org.sparkproject.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
      	at org.sparkproject.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
      	at org.sparkproject.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
      	at java.base/java.lang.Thread.run(Thread.java:840)
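
      If we read the stack trace correctly, the NullPointerException comes from the (shaded) Jetty HTTP client used by the reverse-proxy servlet: that client seems to have been created without an SslContextFactory, which Jetty requires before it can open https:// connections to the Worker UI. As a minimal sketch of what Jetty expects (this is not Spark's code, just the Jetty 9.4 API behind the org.sparkproject.jetty packages in the trace):

      import org.sparkproject.jetty.client.HttpClient
      import org.sparkproject.jetty.util.ssl.SslContextFactory

      // Minimal sketch, not Spark's code: a Jetty HttpClient can only open
      // https:// connections if it was constructed with an SslContextFactory.
      // Without one, SslClientConnectionFactory fails with exactly the NPE above.
      val ssl = new SslContextFactory.Client()
      ssl.setTrustStorePath("/opt/spark/conf/certs/truststore.jks") // same truststore as in our configuration below
      ssl.setTrustStorePassword("the-password")

      val client = new HttpClient(ssl) // Jetty 9.4 constructor taking an SslContextFactory
      client.start()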
       

      The master has the following configuration file.

      spark.network.crypto.keyLength 128
      spark.metrics.conf.*.sink.prometheusServlet.class org.apache.spark.metrics.sink.PrometheusServlet
      spark.metrics.conf.*.sink.prometheusServlet.path /metrics
      spark.metrics.conf.applications.sink.prometheusServlet.path /metrics
      spark.ssl.needClientAuth false
      spark.ssl.keyStoreType JKS
      spark.network.crypto.enabled true
      spark.metrics.namespace ""
      spark.ssl.trustStore /opt/spark/conf/certs/truststore.jks
      spark.ssl.protocol TLSv1.2
      spark.ssl.keyStore /opt/spark/conf/certs/keystore.jks
      spark.ssl.trustStoreType JKS
      spark.ssl.standalone.port 8480
      spark.ssl.keyPassword the-password
      spark.authenticate.secret the-password 
      spark.ui.reverseProxy true
      spark.metrics.conf.master.sink.prometheusServlet.path /metrics
      spark.ssl.keyStorePassword the-password
      spark.ssl.trustStorePassword the-password 
      spark.ssl.enabled true
      spark.authenticate true
      spark.ui.reverseProxyUrl https://sdwbgn.poc:8080 
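
      Since the Master is the side acting as reverse proxy, we assume that the spark.ssl.trustStore settings above are the ones its proxy client would need when forwarding requests to the https Worker UI. If the cause really is that the proxy servlet's HttpClient has no SslContextFactory, a workaround along these lines might be possible; this is a purely hypothetical sketch on our part (Jetty's ProxyServlet allows overriding newHttpClient()), not Spark's actual JettyUtils code:

      import org.sparkproject.jetty.client.HttpClient
      import org.sparkproject.jetty.proxy.ProxyServlet
      import org.sparkproject.jetty.util.ssl.SslContextFactory

      // Hypothetical sketch: a ProxyServlet whose backing HttpClient is built
      // with an SslContextFactory, so that it can forward requests to an
      // https:// Worker UI.
      class SslAwareProxyServlet extends ProxyServlet {
        override protected def newHttpClient(): HttpClient = {
          val ssl = new SslContextFactory.Client()
          ssl.setTrustStorePath("/opt/spark/conf/certs/truststore.jks") // values from the Master configuration above
          ssl.setTrustStorePassword("the-password")
          new HttpClient(ssl)
        }
      }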

      The worker has the following configuration file.

      spark.metrics.conf.*.sink.prometheusServlet.path /metrics
      spark.ssl.trustStorePassword the-password 
      spark.authenticate.secret the-password
      spark.ui.reverseProxy true
      spark.ui.reverseProxyUrl https://sdwbgn.poc:8080
      spark.metrics.conf.master.sink.prometheusServlet.path /metrics
      spark.ssl.trustStore /opt/spark/conf/certs/truststore.jks
      spark.ssl.trustStoreType JKS
      spark.ssl.enabled true
      spark.ssl.keyStoreType JKS
      spark.network.crypto.keyLength 128
      spark.metrics.namespace ""
      spark.metrics.conf.*.sink.prometheusServlet.class org.apache.spark.metrics.sink.PrometheusServlet
      spark.metrics.conf.applications.sink.prometheusServlet.path /metrics
      spark.ssl.protocol TLSv1.2
      spark.authenticate true
      spark.network.crypto.enabled true
      spark.ssl.keyStore /opt/spark/conf/certs/keystore.jks
      spark.ssl.keyPassword the-password 
      spark.ssl.keyStorePassword the-password
      spark.ssl.needClientAuth false
      spark.ssl.standalone.port 8480 
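
      For what it is worth, a direct TLS check against the Worker UI (bypassing the Master) may help to narrow the problem down, i.e. to separate "the Worker's TLS endpoint is broken" from "only the proxied path through the Master is broken". Something along these lines; the host is taken from the worker ID in the error above, while the https UI port is only a placeholder we would have to adjust:

      import java.io.FileInputStream
      import java.net.URL
      import java.security.KeyStore
      import javax.net.ssl.{HttpsURLConnection, SSLContext, TrustManagerFactory}

      // Hypothetical diagnostic: contact the Worker UI directly over TLS,
      // trusting the same truststore as in the configuration above.
      val ts = KeyStore.getInstance("JKS")
      ts.load(new FileInputStream("/opt/spark/conf/certs/truststore.jks"), "the-password".toCharArray)

      val tmf = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm)
      tmf.init(ts)

      val ctx = SSLContext.getInstance("TLSv1.2")
      ctx.init(null, tmf.getTrustManagers, null)

      // Placeholder address: worker host from the error above, web UI port guessed.
      val conn = new URL("https://10.113.3.1:8481/").openConnection().asInstanceOf[HttpsURLConnection]
      conn.setSSLSocketFactory(ctx.getSocketFactory)
      println(s"Worker UI responded with HTTP ${conn.getResponseCode}")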

      If we disable SSL/TLS on the Worker, then everything works fine, and we can use the Master as reverse proxy as expected.

      We are running Spark in stand-alone mode on Kubernetes. The containers we are using are based on Debian 11 (minideb). The Spark version is 3.5. Do not hesitate to ask for further information if required.

      It looks like a bug to us, but before opening a bug ticket, I would like some feedback about a possible misconfiguration or other causes of the error.

      Many thanks in advance!

       

      Regards,

      Filippo

          People

            Assignee: Unassigned
            Reporter: Filippo Monari (fmonari)