Spark / SPARK-25057

Unable to start spark on master URL


Details

    • Type: Question
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.2.2
    • Fix Version/s: None
    • Component/s: Java API
    • Environment: Spring Boot, Spark 2.2.2, Cassandra 3.5.1
    • Labels: Important

    Description

      I am building a REST microservice with Spark and Cassandra. When I set the Spark master to local, it runs fine.

      But when I provide the Spark master URL as "spark://ip:7077", the following error is thrown when I start the REST service:

      Caused by: java.io.IOException: Failed to send RPC 8950209836630764258 to /98.8.150.125:7077: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
      at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237) ~[spark-network-common_2.11-2.2.2.jar!/:2.2.2]
      at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.notifyOutboundHandlerException(AbstractChannelHandlerContext.java:837) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:740) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:305) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1089) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1136) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1078) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      ... 1 common frames omitted
      
      
      
      Caused by: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
      at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
      ... 16 common frames omitted
      

      I am using the following Spark and Cassandra dependencies for my REST service:

      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
      </dependency>

      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
      </dependency>

      <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>${spark.cassandra.connector.version}</version>
      </dependency>

      <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.11</artifactId>
        <version>${spark.cassandra.connector.java.version}</version>
      </dependency>
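
      An AbstractMethodError at runtime usually indicates a binary mismatch: here, spark-network-common's MessageWithHeader was compiled against one Netty version while a different io.netty artifact (the stack trace shows netty-all-4.1.10.Final, likely pulled in by Spring Boot) is on the classpath. A minimal sketch of one way to address this in Maven is to pin a single Netty version via dependencyManagement; the version shown below is an assumption, not a verified fix, and should be replaced with whatever your Spark release actually declares (check with mvn dependency:tree -Dincludes=io.netty):

      <!-- Sketch only: force one Netty version for the whole build so that
           spark-network-common and the Spring Boot fat jar agree. The version
           below is a hypothetical placeholder. -->
      <dependencyManagement>
        <dependencies>
          <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.0.43.Final</version>
          </dependency>
        </dependencies>
      </dependencyManagement>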

      I also tried setting the Spark master URL in spark-env.sh under the Spark conf directory, but that did not help either. Has anyone faced a similar issue before? Any help is appreciated.


          People

            Assignee: Unassigned
            Reporter: Shivam Gupta (shivam07)