
Kafka Engine: "Can't get assignment" with Strimzi external listener and Materialized Views #1270

@bhaskarblur

Description

ClickHouse Kafka Engine fails to get partition assignments when consuming from a Strimzi-managed Kafka cluster on Kubernetes, even after fixing the brokers' advertised.listeners configuration. Materialized Views attached to the Kafka table are created but never trigger inserts, despite messages being consumed successfully.
The consumer groups for the ClickHouse Kafka table engine show only 6-7 members, and membership sometimes drops to 2.
A plain connection to Kafka from ClickHouse works fine, and the topic and its partitions exist.
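
For reference, the pipeline is set up roughly like the sketch below. The table names come from the queries further down; the columns, broker address, topic name, consumer group, and Schema Registry URL are illustrative placeholders, not the production DDL.

-- Kafka engine table reading Avro (Confluent wire format) via the external listener
CREATE TABLE tdf_data_kafka
(
    id UInt64,          -- illustrative columns
    payload String
)
ENGINE = Kafka
SETTINGS
    kafka_broker_list = 'my-cluster-kafka-external-bootstrap:9094',   -- assumed bootstrap address
    kafka_topic_list = 'tdf_data',                                    -- assumed topic name
    kafka_group_name = 'clickhouse_tdf_data',                         -- assumed consumer group
    kafka_format = 'AvroConfluent',
    format_avro_schema_registry_url = 'http://schema-registry:8081',  -- assumed registry URL
    kafka_num_consumers = 7;

-- Target table (illustrative schema and name)
CREATE TABLE tdf_data
(
    id UInt64,
    payload String
)
ENGINE = MergeTree
ORDER BY id;

-- The MV should copy every consumed block from the Kafka table into the target table
CREATE MATERIALIZED VIEW tdf_data_consumer TO tdf_data AS
SELECT id, payload
FROM tdf_data_kafka;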

Environment

  • ClickHouse Version: 25.3.6.10034.altinitystable (Altinity)
  • Deployment: 3-node EC2 cluster
  • Kafka: Strimzi Kafka Operator 0.x on Kubernetes
  • Kafka Version: 4.0.0
  • Schema Registry: Confluent Schema Registry
  • Format: Avro with Confluent wire format

Current status

  • Kafka connectivity: ✅ Port 9094 reachable
  • Schema Registry: ✅ Working, schema validation passed
  • Consumer group creation: ✅ 7 consumers created on each node (21 total)
  • Partition assignment: ⚠️ Only 2/20 partitions assigned despite 7 consumers per node
  • Materialized View: ⚠️ Attached to the Kafka table ("Started streaming to 1 attached views" is logged) but never triggers inserts
  • Data flow: ❌ 0 rows inserted into the target table despite successful consumer creation

System table output

SELECT count() as consumers, sum(is_currently_used) as active, sum(length(assignments.partition_id)) as partitions_assigned
FROM system.kafka_consumers WHERE table = 'tdf_data_kafka';

Result: 7 consumers, 2 active, 2 partitions assigned
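
A per-consumer breakdown makes it easier to see which consumers actually hold assignments and whether any report errors. The columns below are from system.kafka_consumers in recent releases; adjust if the schema differs in this build:

SELECT
    consumer_id,
    is_currently_used,
    assignments.partition_id AS partitions,
    num_messages_read,
    num_rebalance_assignments,
    num_rebalance_revocations,
    exceptions.text AS recent_exceptions
FROM system.kafka_consumers
WHERE table = 'tdf_data_kafka';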

MV status

SELECT name, dependencies_table FROM system.tables WHERE name = 'tdf_data_consumer';

Result: dependencies_table: [] (empty - indicates MV not properly linked to Kafka table)
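
The MV-to-source link is normally recorded on the source table's side, so a cross-check against the Kafka table may also be worth running (assuming tdf_data_kafka is the Kafka engine table):

SELECT name, dependencies_database, dependencies_table
FROM system.tables
WHERE name = 'tdf_data_kafka';

-- If the view were attached correctly, dependencies_table here should list 'tdf_data_consumer'.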

Expected Behavior

  • All 20 partitions should be distributed across the 21 consumers (7 per node × 3 nodes).
  • The Materialized View should trigger automatically when messages are consumed.
  • Data should flow from the Kafka topic → Kafka table → MV → target table (see the debugging sketch below).
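
To isolate whether the problem is in consumption or in the MV trigger, one option is a direct read from the Kafka table. This is a debugging sketch, assuming the build supports the stream_like_engine_allow_direct_select setting; note that rows read this way are consumed against the group's offsets and do not reach the MV:

-- Direct SELECTs from streaming engines are disabled by default
SET stream_like_engine_allow_direct_select = 1;

-- Pulls a batch from the topic directly; if this returns rows, consumption itself works
SELECT *
FROM tdf_data_kafka
LIMIT 10;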
