I am trying to run the Python Spark Streaming job given in the examples directory —
"Counts words in UTF8 encoded, '\n' delimited text received from the network" — the example run using the assembly jar under `external/kafka-assembly/target/scala-*/spark-streaming-kafka`
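As a point of reference, the core transformation that word-count streaming examples of this kind perform can be sketched in plain Python, with no Spark or Kafka required. This is only an illustration of the flatMap → map → reduce-by-key logic applied to each batch of '\n'-delimited UTF-8 text; the `count_words` helper is hypothetical, not part of the Spark example itself.

```python
from collections import Counter

def count_words(raw: bytes) -> Counter:
    """Count words in UTF-8 encoded, '\n'-delimited text.

    Hypothetical helper mirroring what the streaming job does per batch:
    split into lines, split lines into words, count occurrences.
    """
    lines = raw.decode("utf-8").split("\n")
    return Counter(word for line in lines for word in line.split(" ") if word)

# Example: two lines of text, as if received from the network
print(count_words(b"hello world\nhello kafka"))
# → Counter({'hello': 2, 'world': 1, 'kafka': 1})
```

In the actual Spark example the same logic is expressed as DStream operations over micro-batches rather than a single function call.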
But recently we added a new topic to an existing Flink job, and on startup it immediately began failing with the following root error:

    at org.apache.kafka.common.record.DefaultRecordBatch.compressedIterator(DefaultRecordBatch.java:256)
    at