Commit e691497
chore: a sample with Confluent Docker images
1 parent: 6cb58b5

6 files changed: +153 additions, -2 deletions
Dockerfile (2 additions, 0 deletions):

```dockerfile
FROM confluentinc/cp-kafka-connect-base:7.3.2
RUN confluent-hub install --no-prompt questdb/kafka-questdb-connector:0.6
```
Docker Compose file (68 additions, 0 deletions):

```yaml
version: '2.1'
services:
  kafka-ui:
    container_name: kafka-ui
    image: provectuslabs/kafka-ui:latest
    ports:
      - 8080:8080
    depends_on:
      - kafka1
      - connect
    links:
      - kafka1:kafka1
    environment:
      KAFKA_CLUSTERS_0_NAME: kafka
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka1:9092
      KAFKA_CLUSTERS_0_METRICS_PORT: 9997
      KAFKA_CLUSTERS_0_KAFKACONNECT_0_NAME: connect
      KAFKA_CLUSTERS_0_KAFKACONNECT_0_ADDRESS: http://connect:8083
  questdb:
    image: questdb/questdb:7.0.1
    expose:
      - "9009"
    ports:
      - "9000:9000"
    environment:
      - JAVA_OPTS=-Djava.locale.providers=JRE,SPI
  zookeeper:
    image: zookeeper:3.6.2
    ports:
      - "2181:2181"
  kafka1:
    image: confluentinc/cp-kafka:7.3.2
    ports:
      - "9092:9092"
    depends_on:
      - zookeeper
    links:
      - zookeeper:zookeeper
    environment:
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_ADVERTISED_LISTENERS: "PLAINTEXT://kafka1:9092"
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_JMX_PORT: 9997
      KAFKA_JMX_OPTS: -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Djava.rmi.server.hostname=kafka0 -Dcom.sun.management.jmxremote.rmi.port=9997
  connect:
    image: kafka-connect-with-questdb
    build:
      dockerfile: ./Dockerfile
      context: .
    depends_on:
      - kafka1
      - questdb
    links:
      - kafka1:kafka1
      - questdb:questdb
    environment:
      CONNECT_BOOTSTRAP_SERVERS: "kafka1:9092"
      CONNECT_GROUP_ID: "quest_grp"
      CONNECT_CONFIG_STORAGE_TOPIC: _connect_configs
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_TOPIC: _connect_offset
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_TOPIC: _connect_status
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: "org.apache.kafka.connect.storage.StringConverter"
      CONNECT_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_VALUE_CONVERTER_SCHEMAS_ENABLE: "false"
      CONNECT_REST_ADVERTISED_HOST_NAME: "connect"
```
(Two binary image files added, 141 KB and 89.1 KB: the screenshots referenced from the README as img/create.png and img/questdb.png.)
README for the sample (78 additions, 0 deletions):

# Sample Project: Confluent Images

This sample project shows how to use the [QuestDB Kafka connector](https://questdb.io/docs/third-party-tools/kafka/questdb-kafka/) together with the Confluent Docker images for Kafka. The sample uses [Confluent CP Kafka Connect](https://hub.docker.com/r/confluentinc/cp-kafka-connect-base) as the base for building a custom Docker image with the QuestDB Kafka connector installed. The connector is installed from the [Confluent Hub](https://www.confluent.io/hub/questdb/kafka-questdb-connector).

It also uses the [Kafka UI](https://github.com/provectus/kafka-ui) project for Kafka administration.
## Prerequisites

- Git
- Working Docker environment, including docker-compose
- Internet access to download dependencies

## Usage

- Clone this repository via `git clone https://github.com/questdb/kafka-questdb-connector.git`.
- Run `cd kafka-questdb-connector/kafka-questdb-connector-samples/confluent-docker-images` to enter the directory with this sample.
- Run `docker compose build` to build a Docker image with the sample project.
- Run `docker compose up` to start the Apache Kafka, Zookeeper, Kafka Connect, Kafka UI and QuestDB containers.
- The previous command generates many log messages. Eventually the logging should cease; this means both Apache Kafka and QuestDB are running.
- Go to http://localhost:8080/ui/clusters/kafka/connectors and click the “Create Connector” button.
![screenshot of Kafka UI, with the Create Connector button highlighted](img/create.png)
- Name the connector QuestDB, use the following configuration, and click Submit:
```json
{
  "connector.class": "io.questdb.kafka.QuestDBSinkConnector",
  "topics": "Orders",
  "host": "questdb:9009",
  "name": "questdb",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "include.key": false,
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "table": "orders_table",
  "value.converter.schemas.enable": false
}
```
- Go to http://localhost:8080/ui/clusters/kafka/all-topics/Orders and click the “Produce Message” button. If the topic has not been created yet, refresh the page.
- Use the following JSON as the value, keep the default values for the remaining fields, and click “Produce Message”:
```json
{"firstname": "Arthur", "lastname": "Dent", "age": 42}
```
- Go to the [QuestDB web console](http://localhost:9000) and run the following query:
```sql
select * from orders_table;
```
- You should see a result similar to this:
![screenshot of QuestDB web console, with the result of the query](img/questdb.png)

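If you prefer the command line over the Kafka UI, the produce-and-query steps above can also be done from a shell. This is a sketch, not part of the sample: it assumes the compose stack is up and reuses the `kafka1` service, the `Orders` topic, and the `orders_table` table defined in this sample.

```shell
# Sketch: CLI alternative to the Kafka UI steps (assumes `docker compose up`
# has been run in this sample's directory and the services are reachable).

# Produce the sample message with the console producer shipped in the broker image.
echo '{"firstname": "Arthur", "lastname": "Dent", "age": 42}' |
  docker compose exec -T kafka1 kafka-console-producer \
    --bootstrap-server kafka1:9092 --topic Orders ||
  echo "broker not reachable - is the stack up?"

# Query the sink table over QuestDB's HTTP API instead of the web console.
curl -s -G http://localhost:9000/exec \
  --data-urlencode "query=select * from orders_table" ||
  echo "QuestDB not reachable - is the stack up?"
```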
## How it works

The Docker Compose file starts the following containers:
- Kafka broker - the message broker
- Zookeeper - the coordination service for Kafka
- Kafka Connect - the framework for running Kafka connectors
- QuestDB - a fast open-source time-series database
- Kafka UI - the web UI for Kafka administration

The Kafka Connect container is built from the [Confluent CP Kafka Connect](https://hub.docker.com/r/confluentinc/cp-kafka-connect-base) base image using the following Dockerfile:
```dockerfile
FROM confluentinc/cp-kafka-connect-base:7.3.2
RUN confluent-hub install --no-prompt questdb/kafka-questdb-connector:0.6
```
The `confluent-hub` command installs the QuestDB Kafka connector from the [Confluent Hub](https://www.confluent.io/hub/questdb/kafka-questdb-connector).
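To confirm the install worked at runtime, you can ask the Connect worker which plugins it knows about. A sketch, assuming the stack is up: port 8083 is not published to the host by this compose file, so the request is issued from inside the `connect` container (the Confluent image ships curl).

```shell
# Sketch: list the connector plugins known to the Connect worker; the QuestDB
# sink should appear in the returned JSON array.
docker compose exec connect curl -s http://localhost:8083/connector-plugins ||
  echo "Kafka Connect not reachable - is the stack up?"
```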

Once all containers are running, we use the Kafka UI to start the QuestDB connector. The connector is configured to read data from the Kafka topic `Orders` and write it to the QuestDB table `orders_table`.

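The connector can also be registered without the UI, via Kafka Connect's REST API. A sketch, assuming the stack is up; note that the REST payload puts the connector `name` at the top level and nests the settings under `config`:

```shell
# Sketch: register the same connector through Kafka Connect's REST API
# rather than the Kafka UI.
cat > /tmp/questdb-connector.json <<'EOF'
{
  "name": "questdb",
  "config": {
    "connector.class": "io.questdb.kafka.QuestDBSinkConnector",
    "topics": "Orders",
    "host": "questdb:9009",
    "table": "orders_table",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": false,
    "include.key": false
  }
}
EOF

# Port 8083 is not published to the host by this compose file, so issue the
# request from inside the `connect` container (the Confluent image ships curl).
docker compose exec -T connect curl -s -X POST \
  -H "Content-Type: application/json" --data @- \
  http://localhost:8083/connectors < /tmp/questdb-connector.json ||
  echo "Kafka Connect not reachable - is the stack up?"
```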
Then we use the Kafka UI to produce a message to the `Orders` topic. The message is a JSON document with the following structure:
```json
{"firstname": "Arthur", "lastname": "Dent", "age": 42}
```
The Kafka Connect container receives the message and writes it to the QuestDB table.

Finally, we use the QuestDB web console to query the table and see the result!
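Since the table is auto-created when the first message arrives, it can be useful to check the column types QuestDB inferred for the JSON fields. A sketch using QuestDB's `SHOW COLUMNS` statement in the web console:

```sql
show columns from orders_table;
```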

## Further reading

- [QuestDB Kafka connector](https://questdb.io/docs/third-party-tools/kafka/questdb-kafka/)

## Bugs and Feedback

For bugs, questions and discussions, please use [GitHub Issues](https://github.com/questdb/kafka-questdb-connector/issues/new).
Samples overview README (5 additions, 2 deletions). The sample count is updated from 2 to 3 and a section for the new sample is added; the file now reads:

# Sample Projects

There are 3 sample projects:

## [Faker](faker)
A simplistic project which uses a simple node.js application to create JSON entries in Apache Kafka and the QuestDB Kafka Connect Sink to feed the generated data from Kafka to QuestDB.

## [Stocks](stocks)
This project uses Debezium to stream data from Postgres to Kafka and the QuestDB Kafka Connect Sink to feed data from Kafka to QuestDB. It also uses Grafana to visualize the data.

## [Confluent-Docker-Images](confluent-docker-images)
This project uses the Confluent Docker images to create a Kafka cluster and the QuestDB Kafka Connect Sink to feed data from Kafka to QuestDB. It installs the QuestDB Kafka Connect Sink from the [Confluent Hub](https://www.confluent.io/hub/questdb/kafka-questdb-connector).
