From time to time you will need to debug a setup that uses the Alpakka Kafka Connector. This page collects a few ideas to start with in case the connector does not behave as you expected.

Logging with SLF4J

Akka, Akka Streams and thus the Alpakka Kafka Connector support the SLF4J logging API. To enable it, add Akka’s SLF4J module and an SLF4J-compatible logging framework, e.g. Logback, to your dependencies.

The Kafka client library used by the Alpakka Kafka connector uses SLF4J as well.
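Because both Akka and the Kafka client log via SLF4J, the logging backend controls verbosity per logger name. As a sketch, a minimal logback.xml (on the classpath root) could enable DEBUG output for the Alpakka Kafka internals while keeping the Apache Kafka client quieter; the levels chosen here are illustrative, not a recommendation:

```xml
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date{ISO8601} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Alpakka Kafka connector internals -->
  <logger name="akka.kafka" level="DEBUG"/>
  <!-- the Apache Kafka client is very chatty at DEBUG -->
  <logger name="org.apache.kafka" level="INFO"/>

  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```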

val AkkaVersion = "2.9.3"
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-slf4j" % AkkaVersion,
  "ch.qos.logback" % "logback-classic" % "1.2.3"
)
def versions = [
  AkkaVersion: "2.9.3",
  ScalaBinary: "2.13"
]
dependencies {
  implementation "com.typesafe.akka:akka-slf4j_${versions.ScalaBinary}:${versions.AkkaVersion}"
  implementation "ch.qos.logback:logback-classic:1.2.3"
}

To enable Akka SLF4J logging, configure Akka in application.conf as below. Refer to the Akka documentation for details.

akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}

Receive logging

When debugging the internals of the Kafka Consumer actor, you might want to enable receive logging to see all messages it receives. To lower the log message volume, change the Kafka poll interval to something larger, e.g. 300 ms.

akka {
  actor {
    debug.receive = true
  }
  kafka.consumer {
    poll-interval = 300ms
  }
}
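Receive logging is one of several debug switches Akka offers. A sketch of a fuller debug configuration, using standard Akka settings alongside `debug.receive` (whether each is useful depends on what you are chasing):

```hocon
akka {
  loglevel = "DEBUG"
  actor {
    debug {
      receive = true    # log messages to actors wrapped in LoggingReceive
      lifecycle = true  # log actor lifecycle changes (started, stopped, restarted)
      unhandled = true  # log messages the actor did not handle
    }
  }
}
```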