Spark / SPARK-25983

spark-sql-kafka-0-10 no longer works with Kafka 0.10.0


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Package org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0 is no longer compatible with org.apache.kafka:kafka_2.11:0.10.0.1.

      When both packages are used in the same project, the following exception occurs:

      java.lang.NoClassDefFoundError: org/apache/kafka/common/protocol/SecurityProtocol
       at kafka.server.Defaults$.<init>(KafkaConfig.scala:125)
       at kafka.server.Defaults$.<clinit>(KafkaConfig.scala)
       at kafka.log.Defaults$.<init>(LogConfig.scala:33)
       at kafka.log.Defaults$.<clinit>(LogConfig.scala)
       at kafka.log.LogConfig$.<init>(LogConfig.scala:152)
       at kafka.log.LogConfig$.<clinit>(LogConfig.scala)
       at kafka.server.KafkaConfig$.<init>(KafkaConfig.scala:265)
       at kafka.server.KafkaConfig$.<clinit>(KafkaConfig.scala)
       at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:759)
       at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:761)
      

       

      This exception is caused by an incompatible transitive dependency pulled in by Spark: org.apache.kafka:kafka-clients:2.0.0.
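
      The failure is a class relocation: in kafka-clients 0.10.x, SecurityProtocol lives in org.apache.kafka.common.protocol, while by 2.0.0 it has moved to org.apache.kafka.common.security.auth, so the old name the 0.10.0.1 broker code references no longer resolves. A minimal sketch that probes which location is on the classpath (the object name is hypothetical; only class-name lookups are used, no Kafka dependency is assumed):

      ```scala
      // Sketch: check which SecurityProtocol location the classpath provides.
      // kafka-clients 0.10.x: org.apache.kafka.common.protocol.SecurityProtocol
      // kafka-clients 2.0.0:  org.apache.kafka.common.security.auth.SecurityProtocol
      object SecurityProtocolProbe {
        def onClasspath(fqcn: String): Boolean =
          try { Class.forName(fqcn); true }
          catch { case _: ClassNotFoundException => false }

        def main(args: Array[String]): Unit = {
          val loc010 = "org.apache.kafka.common.protocol.SecurityProtocol"
          val loc20  = "org.apache.kafka.common.security.auth.SecurityProtocol"
          println(s"0.10.x location present: ${onClasspath(loc010)}")
          println(s"2.x location present:    ${onClasspath(loc20)}")
        }
      }
      ```

      If only the 2.x location reports present, any 0.10.0.1 broker code on the same classpath will fail exactly as in the trace above.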
       

      The following sbt workaround resolved the problem in my project:

      dependencyOverrides += "org.apache.kafka" % "kafka-clients" % "0.10.0.1"
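
      For context, a minimal build.sbt sketch showing where the override sits. The artifact coordinates and versions are taken from this report; the scalaVersion value and project layout are assumptions:

      ```scala
      // build.sbt sketch (hypothetical project): pin kafka-clients back to the
      // broker's version so kafka_2.11:0.10.0.1 finds the classes it expects.
      scalaVersion := "2.11.12"  // assumption: any 2.11.x matches these _2.11 artifacts

      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0",
        "org.apache.kafka" %% "kafka"                % "0.10.0.1"
      )

      // Force the transitive kafka-clients (2.0.0 via Spark) down to 0.10.0.1.
      dependencyOverrides += "org.apache.kafka" % "kafka-clients" % "0.10.0.1"
      ```

      Note that downgrading kafka-clients below the version spark-sql-kafka-0-10_2.11:2.4.0 was built against may itself break Spark's Kafka source at runtime; whether it is safe depends on which client APIs are exercised. It resolved the conflict in the reporter's project.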
      

      Attachments

        Issue Links

          Activity

            People

              Assignee: Unassigned
              Reporter: Alexander Bessonov (nonsleepr)
              Votes: 0
              Watchers: 3
