Apache Kudu

The Alpakka Kudu connector supports writing to Apache Kudu tables.

Apache Kudu is a free and open source column-oriented data store in the Apache Hadoop ecosystem.

Project Info: Alpakka Kudu

Artifact: com.lightbend.akka : akka-stream-alpakka-kudu : 1.0-M2
JDK versions: OpenJDK 8
Scala versions: 2.12.7, 2.11.12
JPMS module name: akka.stream.alpakka.kudu
Readiness level: Community-driven, since 0.19, 2018-05-09
Home page: https://doc.akka.io/docs/alpakka/current/
Release notes: In the documentation
Issues: GitHub issues
Sources: https://github.com/akka/alpakka

Artifacts

sbt
libraryDependencies += "com.lightbend.akka" %% "akka-stream-alpakka-kudu" % "1.0-M2"
Maven
<dependency>
  <groupId>com.lightbend.akka</groupId>
  <artifactId>akka-stream-alpakka-kudu_2.12</artifactId>
  <version>1.0-M2</version>
</dependency>
Gradle
dependencies {
  compile group: 'com.lightbend.akka', name: 'akka-stream-alpakka-kudu_2.12', version: '1.0-M2'
}
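Since the module is cross-built for the Scala versions listed above, Maven and Gradle users on Scala 2.11 would use the `_2.11` artifact suffix instead of `_2.12` (sbt's `%%` selects the right suffix automatically), e.g.:

<dependency>
  <groupId>com.lightbend.akka</groupId>
  <artifactId>akka-stream-alpakka-kudu_2.11</artifactId>
  <version>1.0-M2</version>
</dependency>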

The first table below lists the direct dependencies of this module; the dependency tree that follows shows all libraries it depends on transitively.

Direct dependencies
Organization         Artifact             Version    License
com.typesafe.akka    akka-stream_2.12     2.5.19     Apache License, Version 2.0
org.apache.kudu      kudu-client-tools    1.7.1      Apache License, Version 2.0
org.scala-lang       scala-library        2.12.7     BSD 3-Clause
Dependency tree
com.typesafe.akka    akka-stream_2.12    2.5.19    Apache License, Version 2.0
    com.typesafe.akka    akka-actor_2.12    2.5.19    Apache License, Version 2.0
        com.typesafe    config    1.3.3    Apache License, Version 2.0
        org.scala-lang.modules    scala-java8-compat_2.12    0.8.0    BSD 3-clause
            org.scala-lang    scala-library    2.12.7    BSD 3-Clause
        org.scala-lang    scala-library    2.12.7    BSD 3-Clause
    com.typesafe.akka    akka-protobuf_2.12    2.5.19    Apache License, Version 2.0
        org.scala-lang    scala-library    2.12.7    BSD 3-Clause
    com.typesafe    ssl-config-core_2.12    0.3.6    Apache-2.0
        com.typesafe    config    1.3.3    Apache License, Version 2.0
        org.scala-lang.modules    scala-parser-combinators_2.12    1.1.1    BSD 3-clause
            org.scala-lang    scala-library    2.12.7    BSD 3-Clause
        org.scala-lang    scala-library    2.12.7    BSD 3-Clause
    org.reactivestreams    reactive-streams    1.0.2    CC0
    org.scala-lang    scala-library    2.12.7    BSD 3-Clause
org.apache.kudu    kudu-client-tools    1.7.1    Apache License, Version 2.0
    org.slf4j    slf4j-api    1.7.25    MIT License
org.scala-lang    scala-library    2.12.7    BSD 3-Clause

Configuration

To connect to Kudu:

  1. Create a Kudu client and manage its life-cycle
  2. Describe the Kudu Schema (API)
  3. Define a converter function to map your data type to a PartialRow (API)
  4. Specify Kudu CreateTableOptions (API)
  5. Set up Alpakka’s KuduTableSettings (API).
Scala
import akka.stream.alpakka.kudu.KuduTableSettings
import org.apache.kudu.client.{CreateTableOptions, KuduClient, PartialRow}
import org.apache.kudu.{ColumnSchema, Schema, Type}
import scala.collection.JavaConverters._

// Kudu Client
implicit val kuduClient = new KuduClient.KuduClientBuilder(kuduMasterAddress).build
system.registerOnTermination(kuduClient.shutdown())

// Kudu Schema
val cols = List(new ColumnSchema.ColumnSchemaBuilder("key", Type.INT32).key(true).build,
                new ColumnSchema.ColumnSchemaBuilder("value", Type.STRING).build)
val schema = new Schema(cols.asJava)

// Converter function
case class Person(id: Int, name: String)
val kuduConverter: Person => PartialRow = { person =>
  val partialRow = schema.newPartialRow()
  partialRow.addInt(0, person.id)
  partialRow.addString(1, person.name)
  partialRow
}

// Kudu table options
val rangeKeys = List("key")
val createTableOptions = new CreateTableOptions().setNumReplicas(1).setRangePartitionColumns(rangeKeys.asJava)

// Alpakka settings
val kuduTableSettings = KuduTableSettings("test", schema, createTableOptions, kuduConverter)
Java
import akka.stream.alpakka.kudu.KuduTableSettings;
import org.apache.kudu.ColumnSchema;
import org.apache.kudu.Schema;
import org.apache.kudu.Type;
import org.apache.kudu.client.CreateTableOptions;
import org.apache.kudu.client.KuduClient;
import org.apache.kudu.client.KuduException;
import org.apache.kudu.client.PartialRow;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.Function;

// Kudu Client
KuduClient kuduClient = new KuduClient.KuduClientBuilder(kuduMasterAddresses).build();
system.registerOnTermination(
    () -> {
      try {
        kuduClient.shutdown();
      } catch (KuduException e) {
        e.printStackTrace();
      }
    });

// Kudu Schema
List<ColumnSchema> columns = new ArrayList<>(2);
columns.add(new ColumnSchema.ColumnSchemaBuilder("key", Type.INT32).key(true).build());
columns.add(new ColumnSchema.ColumnSchemaBuilder("value", Type.STRING).build());
Schema schema = new Schema(columns);

// Converter function
Function<Person, PartialRow> kuduConverter =
    person -> {
      PartialRow partialRow = schema.newPartialRow();
      partialRow.addInt(0, person.id);
      partialRow.addString(1, person.name);
      return partialRow;
    };

// Kudu table options
List<String> rangeKeys = Collections.singletonList("key");
CreateTableOptions createTableOptions =
    new CreateTableOptions().setNumReplicas(1).setRangePartitionColumns(rangeKeys);

// Alpakka settings
KuduTableSettings<Person> tableSettings =
    KuduTableSettings.create("tablenameSink", schema, createTableOptions, kuduConverter);
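The snippets above reference a `Person` type without defining it; the only assumptions the examples rely on are an `id` and a `name` field. A minimal sketch of such a class (hypothetical; adapt to your own data model) could look like this:

```java
// Minimal immutable Person used by the examples (a sketch, not part of the connector API)
public class Person {
  public final int id;
  public final String name;

  public Person(int id, String name) {
    this.id = id;
    this.name = name;
  }
}
```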

Writing to Kudu in a Flow

Scala
import akka.stream.alpakka.kudu.scaladsl.KuduTable

val flow: Flow[Person, Person, NotUsed] =
  KuduTable.flow(kuduTableSettings.withTableName("Flow"))

val f = Source(11 to 20)
  .map(i => Person(i, s"zozo_$i"))
  .via(flow)
  .runWith(Sink.fold(0)((a, d) => a + d.id))
Java
import akka.stream.alpakka.kudu.javadsl.KuduTable;

Flow<Person, Person, NotUsed> flow =
    KuduTable.flow(tableSettings.withTableName("Flow"), kuduClient);

CompletionStage<List<Person>> run =
    Source.from(Arrays.asList(200, 201, 202, 203, 204))
        .map((i) -> new Person(i, String.format("name_%d", i)))
        .via(flow)
        .toMat(Sink.seq(), Keep.right())
        .run(materializer);

Writing to Kudu with a Sink

Scala
val sink: Sink[Person, Future[Done]] =
  KuduTable.sink(kuduTableSettings.withTableName("Sink"))

val f = Source(1 to 10)
  .map(i => Person(i, s"zozo_$i"))
  .runWith(sink)
Java
final Sink<Person, CompletionStage<Done>> sink =
    KuduTable.sink(tableSettings.withTableName("Sink"), kuduClient);

CompletionStage<Done> o =
    Source.from(Arrays.asList(100, 101, 102, 103, 104))
        .map((i) -> new Person(i, String.format("name %d", i)))
        .runWith(sink, materializer);