
Commit ef62b81

[chore] make the readme.md to use released binary artifacts
also: make code samples QuestDB/Grafana URLs more explicit
1 parent 0e4480c commit ef62b81

File tree

3 files changed (+6, -9 lines)


kafka-questdb-connector-samples/faker/readme.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -20,7 +20,7 @@ The project was tested on MacOS with M1, but it should work on other platforms t
 $ curl -X POST -H "Content-Type: application/json" -d '{"name":"questdb-connect","config":{"topics":"People","connector.class":"io.questdb.kafka.QuestDBSinkConnector","tasks.max":"1","key.converter":"org.apache.kafka.connect.storage.StringConverter","value.converter":"org.apache.kafka.connect.json.JsonConverter","value.converter.schemas.enable":"false","host":"questdb", "timestamp.field.name": "birthday", "transforms":"convert_birthday","transforms.convert_birthday.type":"org.apache.kafka.connect.transforms.TimestampConverter$Value","transforms.convert_birthday.target.type":"Timestamp","transforms.convert_birthday.field":"birthday","transforms.convert_birthday.format": "yyyy-MM-dd'"'"'T'"'"'HH:mm:ss.SSSX"}}' localhost:8083/connectors
 ```
 7. The command above will create a new Kafka connector that will read data from the `People` topic and write it to a QuestDB table called `People`. The connector will also convert the `birthday` field to a timestamp.
-8. Go to the [QuestDB console](http://localhost:19000) and run `select * from 'People';` and you should see some rows.
+8. Go to the QuestDB console running at http://localhost:19000 and run `select * from 'People';` and you should see some rows.
 9. Congratulations! You have successfully created a Kafka connector that reads data from a Kafka topic and writes it to a QuestDB table!
 
 ## How does it work?
````

kafka-questdb-connector-samples/stocks/readme.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -32,12 +32,12 @@ Bear in mind the sample starts multiple containers. It's running fine on my mach
 curl -X POST -H "Content-Type: application/json" -d '{"name":"questdb-connect","config":{"topics":"dbserver1.public.stock","table":"stock", "connector.class":"io.questdb.kafka.QuestDBSinkConnector","tasks.max":"1","key.converter":"org.apache.kafka.connect.storage.StringConverter","value.converter":"org.apache.kafka.connect.json.JsonConverter","host":"questdb", "transforms":"unwrap", "transforms.unwrap.type":"io.debezium.transforms.ExtractNewRecordState", "include.key": "false", "symbols": "symbol", "timestamp.field.name": "last_update"}}' localhost:8083/connectors
 ```
 It starts the QuestDB Kafka Connect sink that will read changes from Kafka and write them to QuestDB.
-9. Go to [QuestDB Web Console](http://localhost:19000/) and execute following query:
+9. Go to QuestDB Web Console running at http://localhost:19000/ and execute following query:
 ```sql
 select * from stock;
 ```
 It should return some rows. If it does not return any rows or returns a _table not found_ error then wait a few seconds and try again.
-10. Go to [Grafana Dashboard](http://localhost:3000/d/stocks/stocks?orgId=1&refresh=5s&viewPanel=2). It should show some data. If it does not show any data, wait a few seconds, refresh try again.
+10. Go to Grafana Dashboard running at http://localhost:3000/d/stocks/stocks?orgId=1&refresh=5s&viewPanel=2. It should show some data. If it does not show any data, wait a few seconds, refresh try again.
 11. Play with the Grafana dashboard a bit. You can change the aggregation interval, change stock, zoom-in and zoom-out, etc.
 12. Go to [QuestDB Web Console](http://localhost:19000/) again and execute following query:
 ```sql
````
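Step 9 in the hunk above tells the reader to retry if `select * from stock;` returns nothing. A quick way to check the other end of the pipeline is the standard Kafka Connect REST API on the same port the `curl` command posts to. This is a hedged troubleshooting sketch, not part of the sample's instructions; `questdb-connect` is the connector name registered by the command above.

```shell
# List connectors registered with Kafka Connect (standard REST API, port 8083 as above).
curl -s localhost:8083/connectors

# Show the state of the sink registered above; the connector and its task should
# both report RUNNING before rows can appear in QuestDB.
curl -s localhost:8083/connectors/questdb-connect/status
```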

readme.md

Lines changed: 3 additions & 6 deletions
````diff
@@ -8,8 +8,8 @@ The connector implements Apache Kafka [Sink Connector API](https://kafka.apache.
 
 ## Usage with Kafka Connect
 This guide assumes you are already familiar with Apache Kafka and Kafka Connect. If you are not then watch this [excellent video](https://www.youtube.com/watch?v=Jkcp28ki82k) or check our [sample projects](kafka-questdb-connector-samples).
-1. Unpack connector ZIP into Kafka Connect `./plugin/` directory.
-2. Start Kafka Connect
+1. [Download](https://github.com/questdb/kafka-questdb-connector/releases/latest) and unpack connector ZIP into Apache Kafka `./libs/` directory.
+2. Start Kafka Connect in the distributed mode.
 3. Create a connector configuration:
 ```json
 {
@@ -21,10 +21,7 @@ This guide assumes you are already familiar with Apache Kafka and Kafka Connect.
 "table": "orders_table",
 "key.converter": "org.apache.kafka.connect.storage.StringConverter",
 "value.converter": "org.apache.kafka.connect.json.JsonConverter",
-"transforms": "unwrap",
-"transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
-"include.key": "false",
-"timestamp.field.name": "created_at"
+"include.key": "false"
 }
 }
 ```
````
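Taken together, the two hunks above change the quick-start to: download the released ZIP, unpack it into Kafka's `./libs/` directory, run Kafka Connect in distributed mode, and register a smaller connector config. A rough shell sketch of those steps follows; the Kafka installation path, the archive name, and the config file name are placeholders, not something the readme specifies.

```shell
# Placeholders: KAFKA_HOME and the connector archive name depend on your setup
# and on the release downloaded from the GitHub releases page.
KAFKA_HOME=/path/to/kafka

# 1. Unpack the released connector ZIP into Kafka's ./libs/ directory.
unzip kafka-questdb-connector-*-bin.zip -d "$KAFKA_HOME/libs/"

# 2. Start Kafka Connect in distributed mode (standard Kafka script and config);
#    this runs in the foreground, so run the next step from another terminal.
"$KAFKA_HOME/bin/connect-distributed.sh" "$KAFKA_HOME/config/connect-distributed.properties"

# 3. Register the connector by POSTing the configuration from step 3 of the readme
#    (saved here as questdb-sink.json, a placeholder file name).
curl -X POST -H "Content-Type: application/json" -d @questdb-sink.json localhost:8083/connectors
```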
