
Spark Structured Streaming: error reading field 'topic_metadata'

I am running Spark 2.4.0 and Kafka 0.10.2.

// Read the "twitter-topic" Kafka topic as a streaming DataFrame
var streamingInputDF =
  spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "twitter-topic")
    .load()
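
For context, the Kafka source exposes key and value as binary columns, so a common next step is to cast them to strings before printing to the console. A minimal sketch, assuming the streamingInputDF above (the twitterMessages name is only illustrative):

// Cast the Kafka source's binary key/value columns to strings so the
// console sink prints readable text; topic/partition/offset are part of
// the standard Kafka source schema.
val twitterMessages = streamingInputDF.selectExpr(
  "CAST(key AS STRING)",
  "CAST(value AS STRING)",
  "topic", "partition", "offset")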

The console writeStream:

// Write the stream to the console sink and block until the query terminates
val activityQuery = streamingInputDF.writeStream
  .format("console")
  .outputMode("append")
  .start()

activityQuery.awaitTermination()
However, when I start the console writeStream, I get the following exception:

org.apache.spark.sql.streaming.StreamingQueryException: Query [id = d21cd9b4-7f51-4f5f-acbf-943dfaaeb7e5, runId = c2b2c58d-7afe-4ca5-bc36-6a3f496c19b3] terminated with exception: Error reading field 'topic_metadata': Error reading array of size 881783, only 41 bytes available
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:295)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'topic_metadata': Error reading array of size 881783, only 41 bytes available
at org.apache.kafka.common.protocol.types.Schema.read(Schema.java:73)
at org.apache.kafka.clients.NetworkClient.parseResponse(NetworkClient.java:380)
at org.apache.kafka.clients.NetworkClient.handleCompletedReceives(NetworkClient.java:449)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:269)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.clientPoll(ConsumerNetworkClient.java:360)

社区小助手 (Community Assistant)  2018-12-06 15:51:39
1 Answer
  • 社区小助手 is the administrator of the Spark China community. I regularly post livestream recaps and other articles, and also consolidate the Spark questions and answers raised in the DingTalk group.

    I added kafka-clients-0.10.2.2.jar to the spark-submit command line, and the error went away. This kind of SchemaException usually points to a mismatch between the kafka-clients version on the classpath and the broker, so pinning the client jar to the broker's 0.10.2 line resolves it (a sketch of such a command line follows below).

    2019-07-17 23:18:36
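
    A hedged sketch of what that spark-submit command line might look like; only the kafka-clients-0.10.2.2.jar addition comes from the answer, while the connector coordinates, paths, and application class are illustrative assumptions:

    # Only --jars kafka-clients-0.10.2.2.jar is from the answer; the package
    # coordinates, application class, and jar paths are placeholders.
    spark-submit \
      --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0 \
      --jars /path/to/kafka-clients-0.10.2.2.jar \
      --class com.example.TwitterStream \
      target/twitter-stream.jar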