You can specify the number of tasks with the `tasks.max` configuration property.

[[data-structures]]
=== Redis Data Structures

The {name} supports the following Redis data-structure types as targets:

[[collection-key]]
==== Collections
For collections (stream, list, set, sorted set, timeseries) a single key is used which is independent of the record key.
Collection keys are generated from the `redis.key` configuration property, which may contain `${topic}` (the default) as a placeholder for the originating topic name.
For example with `redis.key = ${topic}` and topic `orders` the Redis key is `orders`.

For other data structures the key is in the form `<keyspace>:<record_key>`, where `keyspace` is generated from the `redis.key` configuration property as above and `record_key` is the sink record key.
For example with `redis.key = ${topic}`, topic `orders`, and sink record key `123` the Redis key is `orders:123`.
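
For instance, a fixed prefix can be combined with the placeholder; this sketch (prefix chosen purely for illustration) routes every topic to a namespaced key:

[source,properties]
----
# Illustrative: topic "orders" maps to the Redis key "kafka:orders"
redis.key=kafka:${topic}
----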
[[sync-hash]]
==== Hash
Use the following properties to write Kafka records as Redis hashes:

[source,properties]
----
redis.type=HASH
key.converter=<string or bytes> <1>
value.converter=<Avro or JSON> <2>
----

<1> <<key-string,String>> or <<key-bytes,bytes>>
<2> <<avro,Avro>> or <<kafka-json,JSON>>.
If value is null the key is deleted.
[[sync-string]]
==== String
Use the following properties to write Kafka records as Redis strings:

[source,properties]
----
redis.type=STRING
key.converter=<string or bytes> <1>
value.converter=<string or bytes> <2>
----

<1> <<key-string,String>> or <<key-bytes,bytes>>
<2> <<value-string,String>> or <<value-bytes,bytes>>.
If value is null the key is deleted.
[[sync-json]]
==== JSON
Use the following properties to write Kafka records as RedisJSON documents:
[source,properties]
----
redis.type=JSON
key.converter=<string or bytes> <1>
value.converter=<string or bytes> <2>
----
<1> <<key-string,String>> or <<key-bytes,bytes>>
<2> <<value-string,String>> or <<value-bytes,bytes>>.
If value is null the key is deleted.
[[sync-stream]]
==== Stream
Use the following properties to store Kafka records as Redis stream messages:
[source,properties]
----
redis.type=STREAM
redis.key=<stream key> <1>
value.converter=<Avro or JSON> <2>
----
<1> <<collection-key,Stream key>>
<2> <<avro,Avro>> or <<kafka-json,JSON>>
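
Putting these together, a sketch of stream sink settings (the topic name and converter choice are illustrative, not prescribed by the connector):

[source,properties]
----
# Records from topic "orders" are appended as messages
# to the Redis stream "orders"
redis.type=STREAM
redis.key=${topic}
value.converter=org.apache.kafka.connect.json.JsonConverter
----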
[[sync-list]]
==== List
Use the following properties to add Kafka record keys to a Redis list:

[source,properties]
----
redis.type=LIST
redis.key=<list key> <1>
key.converter=<string or bytes> <2>
redis.push.direction=<LEFT or RIGHT> <3>
----

<1> <<collection-key,List key>>
<2> <<key-string,String>> or <<key-bytes,bytes>>
<3> `LEFT` for `LPUSH` or `RIGHT` for `RPUSH`

The Kafka record value can be any format.
If a value is null then the member is removed from the list (instead of pushed to the list).
[[sync-set]]
==== Set
Use the following properties to add Kafka record keys to a Redis set:

[source,properties]
----
redis.type=SET
redis.key=<set key> <1>
key.converter=<string or bytes> <2>
----

<1> <<collection-key,Set key>>
<2> <<key-string,String>> or <<key-bytes,bytes>>

The Kafka record value can be any format.
If a value is null then the member is removed from the set (instead of added to the set).
[[sync-zset]]
==== Sorted Set
Use the following properties to add Kafka record keys to a Redis sorted set:

[source,properties]
----
redis.type=ZSET
redis.key=<sorted set key> <1>
key.converter=<string or bytes> <2>
----

<1> <<collection-key,Sorted set key>>
<2> <<key-string,String>> or <<key-bytes,bytes>>

The Kafka record value should be `float64` and is used for the score.
If the score is null then the member is removed from the sorted set (instead of added to the sorted set).
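
As a sketch (topic, record key, and value are hypothetical), the following configuration adds a record with key `user:1` and value `2.5` from topic `scores` to the sorted set `scores`, equivalent to `ZADD scores 2.5 user:1`:

[source,properties]
----
# Record key "user:1" with float64 value 2.5 on topic "scores"
# becomes member "user:1" with score 2.5 in sorted set "scores"
redis.type=ZSET
redis.key=${topic}
key.converter=org.apache.kafka.connect.storage.StringConverter
----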
[[sync-timeseries]]
==== Time Series
Use the following properties to write Kafka records as RedisTimeSeries samples: