kafka-questdb-connector-samples/stocks/readme.md
Bear in mind the sample starts multiple containers. It's running fine on my machine.

3. Run `docker-compose build` to build Docker images with the sample project. This will take a few minutes.
4. Run `docker-compose up` to start Postgres, the Java stock price updater app, Apache Kafka, Kafka Connect with the Debezium and QuestDB connectors, QuestDB, and Grafana. This will take a few minutes.
5. The previous command will generate a lot of log messages. Eventually the logging should cease; this means all containers are running.
6. At this point all the infrastructure is running and the Java application keeps updating stock prices in Postgres. However, the rest of the pipeline is not yet active: we need to start the Kafka Connect connectors. Kafka Connect exposes a REST API, so we can use `curl` to start them.
7. In a separate shell, execute the following command to start the Debezium connector:
```shell
curl -X POST -H "Content-Type: application/json" -d '{"name":"debezium_source","config":{"tasks.max":1,"database.hostname":"postgres","database.port":5432,"database.user":"postgres","database.password":"postgres","connector.class":"io.debezium.connector.postgresql.PostgresConnector","database.dbname":"postgres","database.server.name":"dbserver1"}} ' localhost:8083/connectors
```
It starts the Debezium connector that will capture changes from Postgres and feed them to Kafka.
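The single-line JSON payload above is dense. For readability, here is the same Debezium source configuration built up in Python — a sketch only; the `curl` command above is what the sample actually uses:

```python
import json

# The same Debezium source configuration as the curl command above,
# expanded for readability.
debezium_source = {
    "name": "debezium_source",
    "config": {
        "tasks.max": 1,
        "database.hostname": "postgres",
        "database.port": 5432,
        "database.user": "postgres",
        "database.password": "postgres",
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.dbname": "postgres",
        "database.server.name": "dbserver1",
    },
}

# Pretty-print the payload that gets POSTed to localhost:8083/connectors.
print(json.dumps(debezium_source, indent=2))
```

Every key under `config` maps one-to-one onto the flat JSON in the `curl` command.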
8. Execute the following command to start the QuestDB Kafka Connect sink:

   It starts the QuestDB Kafka Connect sink that reads the changes from Kafka and writes them to QuestDB.
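This excerpt of the diff does not show the sink's `curl` command itself. Purely as an illustration, a QuestDB sink registration typically looks something like the sketch below — the connector class, topic name, and property values are assumptions, not copied from this sample:

```python
import json

# Hypothetical QuestDB sink configuration. Property values here are
# illustrative assumptions, not taken from the sample project.
questdb_sink = {
    "name": "questdb_sink",
    "config": {
        "connector.class": "io.questdb.kafka.QuestDBSinkConnector",
        # Debezium names topics <server>.<schema>.<table>; assumed here:
        "topics": "dbserver1.public.stock",
        "table": "stock",
        # Assumed QuestDB hostname inside the docker-compose network:
        "host": "questdb",
    },
}

# Equivalent to: curl -X POST -H "Content-Type: application/json" \
#   -d '<payload>' localhost:8083/connectors
print(json.dumps(questdb_sink))
```

The actual command used by the sample is in the full readme; the point is only that the sink is registered through the same Kafka Connect REST endpoint as the Debezium source.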
9. Go to the [QuestDB Web Console](http://localhost:19000/) and execute the following query:
```sql
select * from stock;
```
It should return some rows. If it does not return any rows, or returns a _table not found_ error, then wait a few seconds and try again.
10. Go to the [Grafana Dashboard](http://localhost:3000/d/stocks/stocks?orgId=1&refresh=5s&viewPanel=2). It should show some data. If it does not show any data, wait a few seconds, refresh, and try again.
11. Play with the Grafana dashboard a bit. You can change the aggregation interval, switch stocks, zoom in and out, etc.
12. Go to the [QuestDB Web Console](http://localhost:19000/) again and execute the following query:
```sql
SELECT
timestamp,
  -- ...
SAMPLE by 1m align to calendar;
```
It returns the average, minimum, and maximum stock price for IBM in each minute. You can change the `1m` to `1s` to get data aggregated by second. `SAMPLE BY` is a bit of QuestDB syntactic sugar that makes time-related queries more readable.
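To make the `SAMPLE BY` semantics concrete, here is a small Python sketch that does the equivalent aggregation by hand: bucket ticks into one-minute windows, then compute the average, minimum, and maximum per bucket. The tick data is made up for illustration:

```python
from collections import defaultdict

# Made-up (timestamp_seconds, price) ticks for a single symbol.
ticks = [(0, 100.0), (10, 101.0), (59, 99.0), (60, 102.0), (90, 104.0)]

# Equivalent of `SAMPLE BY 1m`: group rows into one-minute windows.
buckets = defaultdict(list)
for ts, price in ticks:
    buckets[ts // 60].append(price)

# Aggregate each window, like avg()/min()/max() in the SQL query.
for minute, prices in sorted(buckets.items()):
    avg = sum(prices) / len(prices)
    print(minute, round(avg, 2), min(prices), max(prices))
# → 0 100.0 99.0 101.0
# → 1 103.0 102.0 104.0
```

Changing `1m` to `1s` in the SQL corresponds to bucketing by `ts // 1` here: smaller windows, more rows, finer granularity.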
13. Don't forget to stop the containers when you're done. The project generates a lot of data and you could run out of disk space.