Commit: Added kafka spring binder
Walter Schultz committed Jun 4, 2024
1 parent 9e94a67 commit 24dc656
Showing 14 changed files with 862 additions and 0 deletions.
182 changes: 182 additions & 0 deletions docs/user/spring/stream_binder.rst
@@ -0,0 +1,182 @@
Spring Cloud Stream GeoMesa Kafka Datastore Binder
==================================================

The Spring Cloud Stream GeoMesa Kafka Datastore Binder provides an easy way for Spring Cloud Stream
applications to hook into the GeoMesa Kafka Data Store and process its feature events.

If you are unfamiliar with Spring Cloud Stream, see the official documentation for an introduction:
https://spring.io/projects/spring-cloud-stream

Input/Output Types
------------------

This binder delivers every ``KafkaFeatureEvent`` from the Kafka Data Store to your configured function
definitions. Each function must do its own type comparison to determine whether the event is a
``KafkaFeatureEvent.KafkaFeatureChanged``, a ``KafkaFeatureEvent.KafkaFeatureRemoved``, or another event type.
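
For illustration, the per-event dispatch described above can be sketched as follows. The event classes here
are minimal stand-ins that reuse the GeoMesa type names, not the real implementations; the snippet only
demonstrates the ``instanceof`` pattern a function definition would use:

```java
import java.util.function.Consumer;

public class EventDispatchSketch {
    // Stand-ins for the GeoMesa event hierarchy (the real classes are nested
    // under KafkaFeatureEvent in the geomesa-kafka-datastore module).
    static class KafkaFeatureEvent {}
    static class KafkaFeatureChanged extends KafkaFeatureEvent {}
    static class KafkaFeatureRemoved extends KafkaFeatureEvent {}

    // Classify an event the way a function definition would before acting on it.
    static String classify(KafkaFeatureEvent event) {
        if (event instanceof KafkaFeatureChanged) {
            return "changed";
        } else if (event instanceof KafkaFeatureRemoved) {
            return "removed";
        }
        return "other";
    }

    public static void main(String[] args) {
        Consumer<KafkaFeatureEvent> handler = e -> System.out.println(classify(e));
        handler.accept(new KafkaFeatureChanged()); // prints "changed"
    }
}
```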

The module also ships with a ``SimpleFeature`` converter, which lets you configure function definitions that
consume or produce ``SimpleFeature`` instances and avoid working with ``KafkaFeatureEvent`` directly.

Note: The ``SimpleFeature`` converter extracts the ``SimpleFeature`` from
``KafkaFeatureEvent.KafkaFeatureChanged`` events and ignores all other event types. Any function definition
that consumes ``SimpleFeature`` instances will therefore miss ``KafkaFeatureEvent.KafkaFeatureRemoved`` and
``KafkaFeatureEvent.KafkaFeatureCleared`` messages, and any function definition that only writes
``SimpleFeature`` instances will not be able to send those messages.

Configuration
-------------

The configuration options live under ``spring.cloud.stream.kafka-datastore.binder``. The binder accepts any
configuration option of the standard Java GeoMesa Kafka Data Store, with the periods (``.``) replaced by
dashes (``-``). For example, to specify ``kafka.catalog.topic`` for the binder, set:
```yaml
spring:
  cloud:
    stream:
      kafka-datastore:
        binder:
          kafka-catalog-topic: geomesa-catalog-topic
```

For a full list of configuration options, see: https://www.geomesa.org/documentation/stable/user/kafka/usage.html
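
The dash-to-dot translation is simple to sketch in isolation. ``BinderKeyMapping`` below is a hypothetical
helper written for this example, not part of the binder's API:

```java
import java.util.HashMap;
import java.util.Map;

public class BinderKeyMapping {
    // Convert the dash-separated keys accepted by the binder back into the
    // dotted parameter names expected by the GeoMesa Kafka data store.
    static Map<String, String> toDataStoreParams(Map<String, String> binderConfig) {
        Map<String, String> params = new HashMap<>();
        binderConfig.forEach((key, value) -> params.put(key.replace('-', '.'), value));
        return params;
    }

    public static void main(String[] args) {
        // kafka-catalog-topic becomes kafka.catalog.topic; the value is untouched
        System.out.println(toDataStoreParams(Map.of("kafka-catalog-topic", "geomesa-catalog-topic")));
    }
}
```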

Examples
--------

Simple Logger App
~~~~~~~~~~~~~~~~~
```java
@Bean
public Consumer<KafkaFeatureEvent> log() {
    return obj -> logger.info(obj.toString());
}
```
```yaml
spring:
  cloud:
    function:
      definition: log
    stream:
      kafka-datastore.binder:
        kafka-brokers: kafka:9092
        kafka-zookeepers: zookeeper:2181
      function.bindings:
        log-in-0: input
      bindings:
        input:
          destination: messages
          group: logger
```

Simple Enricher App
~~~~~~~~~~~~~~~~~~~

```java
@Bean
public Function<SimpleFeature, SimpleFeature> attachSourceField() {
    return sf -> {
        sf.setAttribute("source", "un-labelled source");
        return sf;
    };
}
```
```yaml
spring:
  cloud:
    function:
      definition: attachSourceField
    stream:
      kafka-datastore.binder:
        kafka-brokers: kafka:9092
        kafka-zookeepers: zookeeper:2181
      function.bindings:
        attachSourceField-in-0: input
        attachSourceField-out-0: output
      bindings:
        input:
          destination: un-labelled-source-ob
          group: sft-reader
        output:
          destination: observations
          group: sft-writer
```

Simple Filter App
~~~~~~~~~~~~~~~~~
```java
@Bean
public Function<SimpleFeature, SimpleFeature> filterMoving() {
    return sf -> {
        // returning null drops the message, filtering out in-transit features
        if ("IN_TRANSIT".equals(sf.getAttribute("status"))) {
            return null;
        }
        return sf;
    };
}
```
```yaml
spring:
  cloud:
    function:
      definition: filterMoving
    stream:
      kafka-datastore.binder:
        kafka-brokers: kafka:9092
        kafka-zookeepers: zookeeper:2181
      function.bindings:
        filterMoving-in-0: input
        filterMoving-out-0: output
      bindings:
        input:
          destination: movingAndUnmovingThings
          group: sft-reader
        output:
          destination: unMovingThings
          group: sft-writer
```

Multiple Datastore App
~~~~~~~~~~~~~~~~~~~~~~

When using multiple binders, you simply override the appropriate kafka-datastore fields in each binder's
``environment`` block.

```java
@Bean
public Function<KafkaFeatureEvent, KafkaFeatureEvent> passThrough() {
    return event -> event;
}
```
```yaml
spring:
  cloud:
    function:
      definition: passThrough
    stream:
      kafka-datastore.binder:
        kafka-brokers: kafka:9092
        kafka-zookeepers: zookeeper:2181
      function.bindings:
        passThrough-in-0: input
        passThrough-out-0: output
      binders:
        kds-start:
          type: kafka-datastore
          environment:
            spring.cloud.stream.kafka-datastore.binder:
              kafka-zk-path: geomesa/start
        kds-end:
          type: kafka-datastore
          environment:
            spring.cloud.stream.kafka-datastore.binder:
              kafka-zk-path: geomesa/end
      bindings:
        input:
          destination: observations
          group: sft-reader
          binder: kds-start
        output:
          destination: observations
          group: sft-writer
          binder: kds-end
```
@@ -0,0 +1,66 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.locationtech.geomesa</groupId>
    <artifactId>geomesa-spring</artifactId>
    <version>5.1.0-SNAPSHOT</version>
  </parent>

  <artifactId>geomesa-spring-cloud-stream-binder-kafka-datastore</artifactId>

  <properties>
    <maven.compiler.source>11</maven.compiler.source>
    <maven.compiler.target>11</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.locationtech.geomesa</groupId>
      <artifactId>geomesa-kafka-datastore_2.12</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-stream</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-log4j2</artifactId>
    </dependency>
    <!-- NOTE: needed to work with GeoMesa -->
    <dependency>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-client</artifactId>
      <exclusions>
        <exclusion>
          <groupId>log4j</groupId>
          <artifactId>log4j</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-framework</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-recipes</artifactId>
    </dependency>

    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <groupId>org.apache.logging.log4j</groupId>
          <artifactId>log4j-to-slf4j</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
  </dependencies>

</project>
@@ -0,0 +1,75 @@
/***********************************************************************
 * Copyright (c) 2013-2024 Commonwealth Computer Research, Inc.
 * All rights reserved. This program and the accompanying materials
 * are made available under the terms of the Apache License, Version 2.0
 * which accompanies this distribution and is available at
 * http://www.opensource.org/licenses/apache2.0.php.
***********************************************************************/

package org.locationtech.geomesa.spring.binder.kafka.datastore;

import org.geotools.api.data.DataStore;
import org.geotools.api.data.DataStoreFinder;
import org.locationtech.geomesa.spring.binder.kafka.datastore.converters.SimpleFeatureConverter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.context.PropertyPlaceholderAutoConfiguration;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

import java.io.IOException;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

@Configuration
@Import({ PropertyPlaceholderAutoConfiguration.class })
@EnableConfigurationProperties({KafkaDatastoreBinderConfigurationProperties.class})
public class KafkaDatastoreBinderConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(KafkaDatastoreBinderConfiguration.class);

    @Autowired
    KafkaDatastoreBinderConfigurationProperties kafkaDatastoreBinderConfigurationProperties;

    @Bean
    @ConditionalOnMissingBean
    public Supplier<DataStore> dsFactory() {
        return () -> {
            // translate the dash-separated binder keys back into the dotted
            // parameter names expected by DataStoreFinder
            Map<String, Serializable> inParameters = new HashMap<>();
            kafkaDatastoreBinderConfigurationProperties.getBinder()
                .forEach((key, value) -> inParameters.put(key.replace('-', '.'), value));
            logger.info("Binder config: {}", kafkaDatastoreBinderConfigurationProperties.getBinder());
            logger.info("Connecting to the KDS with params: {}", inParameters);

            try {
                return DataStoreFinder.getDataStore(inParameters);
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        };
    }

    @Bean
    @ConditionalOnMissingBean
    public KafkaDatastoreBinderProvisioner kafkaDatastoreBinderProvisioner() {
        return new KafkaDatastoreBinderProvisioner();
    }

    @Bean
    @ConditionalOnMissingBean
    public KafkaDatastoreMessageBinder kafkaDatastoreMessageBinder(KafkaDatastoreBinderProvisioner kafkaDatastoreBinderProvisioner) {
        return new KafkaDatastoreMessageBinder(null, kafkaDatastoreBinderProvisioner, dsFactory());
    }

    @Bean
    @ConditionalOnMissingBean
    public SimpleFeatureConverter simpleFeatureConverter() {
        return new SimpleFeatureConverter();
    }
}
@@ -0,0 +1,28 @@
/***********************************************************************
 * Copyright (c) 2013-2024 Commonwealth Computer Research, Inc.
 * All rights reserved. This program and the accompanying materials
 * are made available under the terms of the Apache License, Version 2.0
 * which accompanies this distribution and is available at
 * http://www.opensource.org/licenses/apache2.0.php.
***********************************************************************/

package org.locationtech.geomesa.spring.binder.kafka.datastore;

import org.springframework.boot.context.properties.ConfigurationProperties;

import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

@ConfigurationProperties(prefix = "spring.cloud.stream.kafka-datastore")
public class KafkaDatastoreBinderConfigurationProperties {

    public Map<String, ? extends Serializable> binder = new HashMap<>();

    public Map<String, ? extends Serializable> getBinder() {
        return binder;
    }

    public void setBinder(Map<String, ? extends Serializable> additionalProperties) {
        this.binder = additionalProperties;
    }
}