I am trying to send messages to a DLQ when handling exceptions, but when I use SendToDlqAndContinue from the spring-cloud-stream kafka-streams binder I keep getting a serialization exception:
@EnableBinding(ConsumerStreamsWay.KStreamBinding.class)
public class ConsumerStreamsWay {

    @Autowired
    private SendToDlqAndContinue dlqHandler;

    @StreamListener
    public void topic3Processor2(@Input("topic3") KStream<String, String> input) {
        // listener body was cut off in the original post;
        // the parameter type shown here is a guess for readability
    }
}
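A frequent cause of serialization exceptions on the DLQ path is the binder writing to the DLQ with serdes that do not match the record types. Below is a minimal configuration sketch, assuming a 2.x-era spring-cloud-stream-binder-kafka-streams (the `@EnableBinding`/`@StreamListener` style used above); the property names and the `topic3-dlq` topic name are assumptions and may differ in your version:

```properties
# Route serde failures to the DLQ instead of failing the stream thread
spring.cloud.stream.kafka.streams.binder.serdeError=sendToDlq
# DLQ topic for the "topic3" binding ("topic3-dlq" is a hypothetical name)
spring.cloud.stream.kafka.streams.bindings.topic3.consumer.dlqName=topic3-dlq
# Default serdes the binder uses, including when producing to the DLQ
spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
```

If the records are not plain strings, the two `default.*.serde` entries must name serdes that actually match the key/value types, otherwise the DLQ producer will fail with exactly this kind of serialization error.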
I have built an application that receives data requests over Kafka and persists them with JPA. On every JPA error I get, for example:
Exception in thread "user-4db12638-e58b-4728-9374-886ef20d0f31-StreamThread-1" org.springframework.dao.DataIntegrityViolationException: could not execute statement; SQL [n/a]; constraint [UK_sb8bbouer5wak8vyiiy4pf2bx]; nested exception is
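The stack trace shows the exception escaping on a StreamThread, which kills the stream. One way to keep the thread alive is to catch the persistence failure at the point of the save and divert the bad record instead of rethrowing. A minimal self-contained sketch of that pattern (the names `persistSafely` and the failed-record list are hypothetical; in the real application the save callback would be the JPA repository call and the catch would target `DataIntegrityViolationException`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class SafePersist {
    // Wrap the save so a constraint violation (which surfaces as a
    // RuntimeException) does not propagate and kill the StreamThread.
    static <T> void persistSafely(T record, Consumer<T> save, List<T> failed) {
        try {
            save.accept(record);
        } catch (RuntimeException e) {
            // log and divert the record instead of letting the thread die
            failed.add(record);
        }
    }

    public static void main(String[] args) {
        List<String> failed = new ArrayList<>();
        // simulate a unique-constraint violation on the first record
        persistSafely("dup-key", r -> { throw new RuntimeException("constraint violation"); }, failed);
        persistSafely("ok", r -> {}, failed);
        System.out.println(failed); // prints [dup-key]
    }
}
```

The diverted records could then be handed to the DLQ mechanism discussed above, or retried, depending on whether the constraint violation represents a genuine duplicate.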
When publishing an event to a Kafka topic, I get the following exception:
org.apache.kafka.common.errors.RecordTooLargeException: The message is 2063239 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
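`max.request.size` is a producer-side setting, so it has to be raised in the producer configuration above the size of the failing record (2,063,239 bytes here). A minimal sketch, using string keys so it compiles without the Kafka client on the classpath; the 3 MiB value and the broker address are assumptions:

```java
import java.util.Properties;

public class ProducerTuning {
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumption: local broker
        props.put("max.request.size", "3145728");         // 3 MiB > 2,063,239 bytes
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("max.request.size"));
    }
}
```

Note that the broker-side `message.max.bytes` (and the topic-level `max.message.bytes`) must also allow the larger record, or the broker will reject it even after the producer change.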
I looked at
I am using the Spring Cloud version below and have not specified an explicit version for spring-cloud-stream-kafka-binder in my pom
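When no explicit version is given for the binder, its version should come from the Spring Cloud BOM imported in `dependencyManagement`; a minimal Maven sketch, where `${spring-cloud.version}` stands in for whatever release train the pom actually declares:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-dependencies</artifactId>
      <version>${spring-cloud.version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With this in place, the `spring-cloud-stream-binder-kafka` dependency can be declared without a `<version>` tag, and mismatched binder/Spring Cloud versions are a common source of the kinds of runtime errors described above.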