SPARK-30222: Still getting KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.4.1
    • Fix Version/s: None
    • Component/s: Structured Streaming
    • Labels: None

    Description

      I am using spark-sql 2.4.1 with the Kafka 0.10 connector.

      When I try to consume data, the consumer cache still hits its limit and logs the error in the title, even after setting

      .option("spark.sql.kafkaConsumerCache.capacity", 128)

       

      Dataset<Row> df = sparkSession
             .readStream()
             .format("kafka")
             .option("kafka.bootstrap.servers", SERVERS)
             .option("subscribe", TOPIC)
             .option("spark.sql.kafkaConsumerCache.capacity", 128)
             .load();
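
      Note: in Spark 2.4 the Kafka consumer cache capacity appears to be read from the Spark configuration (SparkConf), not from DataStreamReader options, so the .option(...) call above would have no effect; that would be consistent with the Invalid resolution. A minimal sketch of setting the capacity as a configuration instead — the bootstrap servers and topic below are placeholders standing in for the reporter's SERVERS and TOPIC:

      import org.apache.spark.sql.Dataset;
      import org.apache.spark.sql.Row;
      import org.apache.spark.sql.SparkSession;

      public class KafkaCacheCapacitySketch {
          public static void main(String[] args) {
              // Sketch: set the cache capacity on the Spark configuration at
              // session build time; per-stream reader options are not consulted
              // for this setting in Spark 2.4.
              SparkSession sparkSession = SparkSession.builder()
                      .appName("kafka-cache-capacity-sketch")
                      .config("spark.sql.kafkaConsumerCache.capacity", "128")
                      .getOrCreate();

              Dataset<Row> df = sparkSession
                      .readStream()
                      .format("kafka")
                      .option("kafka.bootstrap.servers", "host1:9092") // placeholder for SERVERS
                      .option("subscribe", "my-topic")                 // placeholder for TOPIC
                      .load();
          }
      }

      The same setting can also be passed at launch time, e.g. spark-submit --conf spark.sql.kafkaConsumerCache.capacity=128.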


          People

            Assignee: Unassigned
            Reporter: BdLearner Shyam
            Votes: 0
            Watchers: 2
