
Commit f690caa

Docs (WIP).
1 parent 2df667a commit f690caa

File tree

8 files changed: 9 additions & 9 deletions

docs/Idempotency/RedisIdempotency.md

Lines changed: 2 additions & 2 deletions

@@ -14,7 +14,7 @@ To install the package and start integrating with Redis:
 dotnet add package Confluent.Kafka.Core.Idempotency.Redis
 ```

-### Usage and Configuration :bar_chart:
+### Usage and Configuration :jigsaw:

 To configure the idempotency handler, use the `WithRedisIdempotencyHandler` method. This handler can be added to both the worker and the retry worker for idempotent processing.

 To add the Redis idempotency handler to a Kafka worker:

@@ -65,7 +65,7 @@ builder.Services.AddKafka(builder =>
 /*.With...*/)))); // Additional options can be added here
 ```

-### Recommended Interface for Message Value :envelope_with_arrow:
+### Recommended Interface for Message Value :e-mail:

 It is strongly recommended to implement the `IMessageValue` interface from the `Confluent.Kafka.Core.Models` namespace in your message value types to establish a standard and simplify message Id discovery. This interface exposes an `Id` property of type `Guid` and can be used with the `MessageIdHandler`.
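The recommendation in the context above can be sketched as follows. This is a minimal, hedged example: the `OrderCreated` type and its `Description` property are illustrative, not part of the library, and it assumes `Id` is the interface's required member.

```csharp
using System;
using Confluent.Kafka.Core.Models;

// Illustrative message value type (not part of the library): implementing
// IMessageValue exposes the Guid Id that the MessageIdHandler can discover.
public sealed class OrderCreated : IMessageValue
{
    public Guid Id { get; set; } = Guid.NewGuid();

    // Hypothetical payload field, for the example only.
    public string Description { get; set; } = string.Empty;
}
```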

docs/OpenTelemetry/OpenTelemetry.md

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ To install the package and start integrating with OpenTelemetry:
 dotnet add package Confluent.Kafka.Core.OpenTelemetry
 ```

-### Usage and Custom Enrichment :bar_chart:
+### Usage and Custom Enrichment :jigsaw:

 To enable distributed tracing, call the `AddKafkaDiagnostics` method while registering the Kafka Core services into the built-in Microsoft dependency injection container. Below are some examples:
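A minimal sketch of that registration, assuming `AddKafkaDiagnostics` hangs off the same builder passed to `AddKafka`; the exact overload and placement may differ in the actual API.

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// AddKafka and AddKafkaDiagnostics are named in these docs; the lambda
// shape shown here is an assumption, for illustration only.
builder.Services.AddKafka(kafka => kafka
    .AddKafkaDiagnostics());

using var host = builder.Build();
```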

docs/Serialization/JsonCore.md

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ To install the package and start integrating with System.Text.Json:
 dotnet add package Confluent.Kafka.Core.Serialization.JsonCore
 ```

-### Usage and Options Configuration :bar_chart:
+### Usage and Options Configuration :jigsaw:

 There are multiple ways to configure the JsonCore serializer for your Kafka producer and consumer, allowing you to set the serializer for either the Key, the Value, or both, depending on your use case. The System.Text.Json library offers many options for configuring how JSON is handled in your messages. These options can be passed through the JsonCore serializer, providing fine-grained control over serialization and deserialization behavior. The options configuration is optional; if not provided, default options are used internally.
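The System.Text.Json options themselves look like this; how they are handed to the JsonCore serializer is library-specific, so the `WithJsonCoreValueSerializer` name below is purely hypothetical.

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// Real System.Text.Json options controlling how JSON is handled.
var options = new JsonSerializerOptions
{
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
    DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
};

// Hypothetical wiring (illustration only):
// producerBuilder.WithJsonCoreValueSerializer(options);
```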

docs/Serialization/JsonNET.md

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ To install the package and start integrating with Newtonsoft.Json:
 dotnet add package Confluent.Kafka.Core.Serialization.NewtonsoftJson
 ```

-### Usage and Settings Configuration :bar_chart:
+### Usage and Settings Configuration :jigsaw:

 There are multiple ways to configure the Json.NET serializer for your Kafka producer and consumer, allowing you to set the serializer for either the Key, the Value, or both, depending on your use case. The Newtonsoft.Json library offers many settings for configuring how JSON is handled in your messages. These settings can be passed through the Json.NET serializer, providing fine-grained control over serialization and deserialization behavior. The settings configuration is optional; if not provided, default settings are used internally.
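The Newtonsoft.Json settings themselves look like this; how they are handed to the Json.NET serializer is library-specific, so the `WithNewtonsoftJsonValueSerializer` name below is purely hypothetical.

```csharp
using Newtonsoft.Json;

// Real Newtonsoft.Json settings controlling how JSON is handled.
var settings = new JsonSerializerSettings
{
    NullValueHandling = NullValueHandling.Ignore,
    DateTimeZoneHandling = DateTimeZoneHandling.Utc
};

// Hypothetical wiring (illustration only):
// producerBuilder.WithNewtonsoftJsonValueSerializer(settings);
```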

docs/Serialization/ProtobufNet.md

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ To install the package and start integrating with protobuf-net:
 dotnet add package Confluent.Kafka.Core.Serialization.ProtobufNet
 ```

-### Usage and Options Configuration :bar_chart:
+### Usage and Options Configuration :jigsaw:

 There are multiple ways to configure the protobuf-net serializer for your Kafka producer and consumer, allowing you to set the serializer for either the Key, the Value, or both, depending on your use case. The protobuf-net library offers some options for configuring how your messages are serialized using protocol buffers. These options can be passed through the protobuf-net serializer, providing fine-grained control over serialization and deserialization behavior. The options configuration is optional; if not provided, default options are used internally.
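With protobuf-net, a message type is typically annotated with contract attributes so the serializer knows its wire layout. The attributes below are real protobuf-net APIs; the `OrderCreated` type is illustrative.

```csharp
using System;
using ProtoBuf;

// Real protobuf-net contract attributes; the type itself is an example.
[ProtoContract]
public sealed class OrderCreated
{
    [ProtoMember(1)]
    public Guid Id { get; set; }

    [ProtoMember(2)]
    public string Description { get; set; } = string.Empty;
}
```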

docs/Serialization/SchemaRegistryAvro.md

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ To install the package and start integrating with Confluent.SchemaRegistry.Serde
 dotnet add package Confluent.Kafka.Core.Serialization.SchemaRegistry.Avro
 ```

-### Usage and Configuration :bar_chart:
+### Usage and Configuration :jigsaw:

 There are multiple ways to configure the SchemaRegistry.Avro serializer for your Kafka producer and consumer, allowing you to set the serializer for either the Key, the Value, or both, depending on your use case. It provides several configurations for controlling how your messages interact with the Confluent Schema Registry. These configurations can be passed through the SchemaRegistry.Avro serializer, allowing for fine-grained control over schema registration, compatibility checks, and both serialization and deserialization behaviors. The Schema Registry client configuration is required; the other configurations are optional and, if not provided, default configurations are used internally.
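The required and optional configurations mentioned above come from the Confluent Schema Registry client library; how they are passed through the SchemaRegistry.Avro serializer is library-specific, so only the config objects are shown. The URL is a placeholder.

```csharp
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// Required: Schema Registry client configuration (placeholder URL).
var registryConfig = new SchemaRegistryConfig
{
    Url = "http://localhost:8081"
};

// Optional: Avro serializer configuration; defaults apply when omitted.
var serializerConfig = new AvroSerializerConfig
{
    AutoRegisterSchemas = true
};
```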

docs/Serialization/SchemaRegistryJson.md

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ To install the package and start integrating with Confluent.SchemaRegistry.Serde
 dotnet add package Confluent.Kafka.Core.Serialization.SchemaRegistry.Json
 ```

-### Usage and Configuration :bar_chart:
+### Usage and Configuration :jigsaw:

 There are multiple ways to configure the SchemaRegistry.Json serializer for your Kafka producer and consumer, allowing you to set the serializer for either the Key, the Value, or both, depending on your use case. It provides several configurations for controlling how your messages interact with the Confluent Schema Registry. These configurations can be passed to the SchemaRegistry.Json serializer, allowing for fine-grained control over schema registration, compatibility checks, and both serialization and deserialization behaviors. The Schema Registry client configuration is required; the other configurations are optional and, if not provided, default configurations are used internally.
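As with the Avro variant, the config objects below come from the Confluent Schema Registry client library; the pass-through into the SchemaRegistry.Json serializer is library-specific and not shown. The URL is a placeholder.

```csharp
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// Required: Schema Registry client configuration (placeholder URL).
var registryConfig = new SchemaRegistryConfig
{
    Url = "http://localhost:8081"
};

// Optional: JSON Schema serializer configuration; defaults apply when omitted.
var serializerConfig = new JsonSerializerConfig
{
    AutoRegisterSchemas = true,
    SubjectNameStrategy = SubjectNameStrategy.Topic
};
```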

docs/Serialization/SchemaRegistryProtobuf.md

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ To install the package and start integrating with Confluent.SchemaRegistry.Serde
 dotnet add package Confluent.Kafka.Core.Serialization.SchemaRegistry.Protobuf
 ```

-### Usage and Configuration :bar_chart:
+### Usage and Configuration :jigsaw:

 There are multiple ways to configure the SchemaRegistry.Protobuf serializer for your Kafka producer and consumer, allowing you to set the serializer for either the Key, the Value, or both, depending on your use case. It provides several configurations for controlling how your messages interact with the Confluent Schema Registry. These configurations can be passed to the SchemaRegistry.Protobuf serializer, allowing for fine-grained control over schema registration, compatibility checks, and both serialization and deserialization behaviors. The Schema Registry client configuration is required; the other configurations are optional and, if not provided, default configurations are used internally.
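A sketch of the Protobuf variant's config objects, again from the Confluent Schema Registry client library; the `UseLatestVersion` pairing shown is one common setup, and the pass-through into the SchemaRegistry.Protobuf serializer is library-specific and not shown. The URL is a placeholder.

```csharp
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// Required: Schema Registry client configuration (placeholder URL).
var registryConfig = new SchemaRegistryConfig
{
    Url = "http://localhost:8081"
};

// Optional: Protobuf serializer configuration; defaults apply when omitted.
var serializerConfig = new ProtobufSerializerConfig
{
    AutoRegisterSchemas = false,
    UseLatestVersion = true
};
```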

0 commit comments
