diff --git a/README.md b/README.md index 291c70970..350a1e6dd 100644 --- a/README.md +++ b/README.md @@ -72,28 +72,25 @@ Kafbat UI wraps major functions of Apache Kafka with an intuitive user interface ![Interface](documentation/images/Interface.gif) ## Topics -Kafbat UI makes it easy for you to create topics in your browser by several clicks, -pasting your own parameters, and viewing topics in the list. +Kafbat UI makes it easy to create topics in your browser with just a few clicks: paste in your own parameters and view the topics in the list. ![Create Topic](documentation/images/Create_topic_kafka-ui.gif) -It's possible to jump from connectors view to corresponding topics and from a topic to consumers (back and forth) for more convenient navigation. -connectors, overview topic settings. +You can jump from the connectors view to the corresponding topics, and from a topic to its consumers (back and forth), for more convenient navigation, including an overview of topic settings. ![Connector_Topic_Consumer](documentation/images/Connector_Topic_Consumer.gif) ### Messages -Let's say we want to produce messages for our topic. With the Kafbat UI we can send or write data/messages to the Kafka topics without effort by specifying parameters, and viewing messages in the list. +Suppose you want to produce messages for your topic. With Kafbat UI, you can easily send or write data/messages to Kafka topics by specifying parameters and viewing messages in the list. ![Produce Message](documentation/images/Create_message_kafka-ui.gif) ## Schema registry -There are 3 supported types of schemas: Avro®, JSON Schema, and Protobuf schemas. +There are three supported schema types: Avro®, JSON Schema, and Protobuf. ![Create Schema Registry](documentation/images/Create_schema.gif) -Before producing avro/protobuf encoded messages, you have to add a schema for the topic in Schema Registry. 
Now all these steps are easy to do -with a few clicks in a user-friendly interface. +Before producing Avro/Protobuf encoded messages, you need to add a schema for the topic in the Schema Registry. All these steps are now easy to do with just a few clicks in a user-friendly interface. ![Avro Schema Topic](documentation/images/Schema_Topic.gif) @@ -111,7 +108,7 @@ docker run -it -p 8080:8080 -e DYNAMIC_CONFIG_ENABLED=true ghcr.io/kafbat/kafka- Then access the web UI at [http://localhost:8080](http://localhost:8080) -The command is sufficient to try things out. When you're done trying things out, you can proceed with a [persistent installation](https://ui.docs.kafbat.io/quick-start/persistent-start) +This command is sufficient to try things out. When you're done, you can proceed with a [persistent installation](https://ui.docs.kafbat.io/quick-start/persistent-start). ## Persistent installation @@ -146,24 +143,24 @@ Please refer to our [configuration](https://ui.docs.kafbat.io/configuration/conf ## Building from sources -[Quick start](https://ui.docs.kafbat.io/development/building/prerequisites) with building +[Quick start](https://ui.docs.kafbat.io/development/building/prerequisites) for building from source ## Liveliness and readiness probes -Liveliness and readiness endpoint is at `/actuator/health`.
-Info endpoint (build info) is located at `/actuator/info`. +The liveness and readiness endpoint is at `/actuator/health`.
+The info endpoint (build info) is located at `/actuator/info`. # Configuration options -All the environment variables/config properties could be found [here](https://ui.docs.kafbat.io/configuration/misc-configuration-properties). +All environment variables and configuration properties can be found [here](https://ui.docs.kafbat.io/configuration/misc-configuration-properties). # Contributing -Please refer to [contributing guide](https://ui.docs.kafbat.io/development/contributing), we'll guide you from there. +Please refer to the [contributing guide](https://ui.docs.kafbat.io/development/contributing); we'll guide you from there. # Support As we're fully independent, team members contribute in their free time. -Your support is crucial for us, if you wish to sponsor us, take a look [here](https://github.com/sponsors/kafbat) +Your support is crucial to us; if you wish to sponsor us, take a look [here](https://github.com/sponsors/kafbat). # Powered by diff --git a/api/src/main/java/io/kafbat/ui/config/McpConfig.java b/api/src/main/java/io/kafbat/ui/config/McpConfig.java index c2a0ef4eb..b583aab9e 100644 --- a/api/src/main/java/io/kafbat/ui/config/McpConfig.java +++ b/api/src/main/java/io/kafbat/ui/config/McpConfig.java @@ -44,12 +44,12 @@ public McpAsyncServer mcpServer(WebFluxSseServerTransportProvider transport) { // Configure server capabilities with resource support var capabilities = McpSchema.ServerCapabilities.builder() .resources(false, true) - .tools(true) // Tool support with list changes notifications + .tools(true) // Tools support with list changes notifications .prompts(false) // Prompt support with list changes notifications .logging() // Logging support .build(); - // Create the server with both tool and resource capabilities + // Create the server with both tools and resource capabilities return McpServer.async(transport) .serverInfo("Kafka UI MCP", "0.0.1") .capabilities(capabilities) diff --git a/documentation/compose/DOCKER_COMPOSE.md 
b/documentation/compose/DOCKER_COMPOSE.md index 30585166b..06bf28afa 100644 --- a/documentation/compose/DOCKER_COMPOSE.md +++ b/documentation/compose/DOCKER_COMPOSE.md @@ -1,13 +1,13 @@ # Descriptions of docker-compose configurations (*.yaml) -1. [kafka-ui.yaml](./kafbat-ui.yaml) - Default configuration with 2 kafka clusters with two nodes of Schema Registry, one kafka-connect and a few dummy topics. -2. [kafka-ui-ssl.yml](./kafka-ssl.yml) - Connect to Kafka via TLS/SSL -3. [kafka-cluster-sr-auth.yaml](./cluster-sr-auth.yaml) - Schema registry with authentication. +1. [kafka-ui.yaml](./kafbat-ui.yaml) - Default configuration with two Kafka clusters, two Schema Registry nodes, one Kafka Connect, and a few dummy topics. +2. [kafka-ui-ssl.yml](./kafka-ssl.yml) - Connect to Kafka via TLS/SSL. +3. [kafka-cluster-sr-auth.yaml](./cluster-sr-auth.yaml) - Schema Registry with authentication. 4. [kafka-ui-auth-context.yaml](./auth-context.yaml) - Basic (username/password) authentication with custom path (URL) (issue 861). -5. [e2e-tests.yaml](./e2e-tests.yaml) - Configuration with different connectors (github-source, s3, sink-activities, source-activities) and Ksql functionality. -6. [kafka-ui-jmx-secured.yml](./ui-jmx-secured.yml) - Kafka’s JMX with SSL and authentication. -7. [kafka-ui-reverse-proxy.yaml](./nginx-proxy.yaml) - An example for using the app behind a proxy (like nginx). -8. [kafka-ui-sasl.yaml](./ui-sasl.yaml) - SASL auth for Kafka. -9. [kafka-ui-traefik-proxy.yaml](./traefik-proxy.yaml) - Traefik specific proxy configuration. -10. [kafka-ui-with-jmx-exporter.yaml](./ui-with-jmx-exporter.yaml) - A configuration with 2 kafka clusters with enabled prometheus jmx exporters instead of jmx. -11. [kafka-with-zookeeper.yaml](./kafka-zookeeper.yaml) - An example for using kafka with zookeeper +5. [e2e-tests.yaml](./e2e-tests.yaml) - Configuration with different connectors (github-source, s3, sink-activities, source-activities) and KSQL functionality. +6. 
[kafka-ui-jmx-secured.yml](./ui-jmx-secured.yml) - Kafka's JMX with SSL and authentication. +7. [kafka-ui-reverse-proxy.yaml](./nginx-proxy.yaml) - An example of using the app behind a proxy (like nginx). +8. [kafka-ui-sasl.yaml](./ui-sasl.yaml) - SASL authentication for Kafka. +9. [kafka-ui-traefik-proxy.yaml](./traefik-proxy.yaml) - Traefik-specific proxy configuration. +10. [kafka-ui-with-jmx-exporter.yaml](./ui-with-jmx-exporter.yaml) - A configuration with two Kafka clusters with Prometheus JMX exporters enabled instead of JMX. +11. [kafka-with-zookeeper.yaml](./kafka-zookeeper.yaml) - An example of using Kafka with ZooKeeper. diff --git a/e2e-playwright/README.md b/e2e-playwright/README.md index 54b7b7b0a..91a986707 100644 --- a/e2e-playwright/README.md +++ b/e2e-playwright/README.md @@ -10,7 +10,7 @@ End-to-End UI test automation using **Playwright**, **Cucumber.js**, and **TypeS ```bash Local run: -Run kafbat (docker compose -f ./documentation/compose/e2e-tests.yaml up -d) +Run Kafbat (docker compose -f ./documentation/compose/e2e-tests.yaml up -d) npm install npx playwright install @@ -24,7 +24,7 @@ npm run debug npm run test:failed -Gihub action CI example +GitHub Actions CI example name: CI on: @@ -53,4 +53,4 @@ jobs: - name: 🚀 Run tests with ENV=prod run: ENV=prod HEAD=false BASEURL=http://localhost:8080 npm run test - \ No newline at end of file +``` diff --git a/frontend/README.md b/frontend/README.md index a2f831111..aae2ae2fb 100644 --- a/frontend/README.md +++ b/frontend/README.md @@ -16,7 +16,7 @@ Web UI for managing Apache Kafka clusters ## Getting started -Go to react app folder +Go to the React app folder ```sh cd ./frontend ``` @@ -42,7 +42,7 @@ Install dependencies pnpm install ``` -Generate API clients from OpenAPI document +Generate API clients from the OpenAPI document ```sh pnpm gen:sources ``` @@ -50,7 +50,7 @@ pnpm gen:sources ## Start application ### Proxying API Requests in Development -Create or update existing `.env.local` file with 
+Create or update the existing `.env.local` file with ``` VITE_DEV_PROXY= https://api.server # your API server ``` @@ -62,14 +62,14 @@ pnpm dev ### Docker way -Have to be run from root directory. +Must be run from the root directory. Start Kafbat UI with your Kafka clusters: ```sh docker-compose -f ./documentation/compose/kafbat-ui.yaml up ``` -Make sure that none of the `.env*` files contain `DEV_PROXY` variable +Make sure that none of the `.env*` files contain the `DEV_PROXY` variable Run the application ```sh diff --git a/frontend/src/lib/hooks/api/ksqlDb.tsx b/frontend/src/lib/hooks/api/ksqlDb.tsx index e562e7c2f..8724250ec 100644 --- a/frontend/src/lib/hooks/api/ksqlDb.tsx +++ b/frontend/src/lib/hooks/api/ksqlDb.tsx @@ -41,7 +41,7 @@ export function useExecuteKsqlkDbQueryMutation() { const getFormattedErrorFromTableData = ( responseValues: KsqlTableResponse['values'] ): { title: string; message: string } => { - // We expect someting like that + // We expect something like that // [[ // "@type", // "error_code", @@ -55,7 +55,7 @@ const getFormattedErrorFromTableData = ( if (!responseValues || !responseValues.length) { return { title: 'Unknown error', - message: 'Recieved empty response', + message: 'Received empty response', }; } diff --git a/frontend/src/lib/permissions.ts b/frontend/src/lib/permissions.ts index e72ffecf9..fba021532 100644 --- a/frontend/src/lib/permissions.ts +++ b/frontend/src/lib/permissions.ts @@ -54,22 +54,22 @@ const valueMatches = (regexp: string | undefined, val: string | undefined) => { }; /** - * @description it the logic behind depending on the roles whether a certain action - * is permitted or not the philosophy is inspired from Headless UI libraries where - * you separate the logic from the renderer besides the Creation process which is handled by isPermittedToCreate + * @description The logic behind determining whether a certain action + * is permitted or not depending on the roles. 
The philosophy is inspired by Headless UI libraries where + * you separate the logic from the renderer, except for the creation process, which is handled by isPermittedToCreate. * - * Algorithm: we Mapped the cluster name and the resource name , because all the actions in them are - * constant and limited and hence faster lookup approach + * Algorithm: We mapped the cluster name and the resource name, because all the actions in them are + * constant and limited, and hence a faster lookup approach. * - * @example you can use this in the hook format where it used in , or if you want to calculate it dynamically - * you can call this dynamically in your component but the render is on you from that point on + * @example You can use this in the hook format where it is used, or if you want to calculate it dynamically, + * you can call this dynamically in your component, but the render is on you from that point on. * - * Don't use this anywhere , use the hook version in the component for declarative purposes + * Don't use this directly; use the hook version in the component for declarative purposes. * - * Array action approach bear in mind they should be from the same resource with the same name restrictions, then the logic it - * will try to find every element from the given array inside the permissions data + * Array action approach: bear in mind they should be from the same resource with the same name restrictions; then the logic + * will try to find every element from the given array inside the permissions data. * - * DON'T use the array approach until it is necessary to do so + * DON'T use the array approach unless it is necessary to do so. 
* * */ export function isPermitted({ @@ -113,15 +113,15 @@ /** * @description it the logic behind depending on create roles, since create has extra custom permission logic that is why - * it is seperated from the others + * it is separated from the others. * - * Algorithm: we Mapped the cluster name and the resource name , because all the actions in them are - * constant and limited and hence faster lookup approach + * Algorithm: We mapped the cluster name and the resource name, because all the actions in them are + * constant and limited, and hence a faster lookup approach. * - * @example you can use this in the hook format where it used in , or if you want to calculate it dynamically - * you can call this dynamically in your component but the render is on you from that point on + * @example You can use this in the hook format where it is used, or if you want to calculate it dynamically, + * you can call this dynamically in your component, but the render is on you from that point on. * - * Don't use this anywhere , use the hook version in the component for declarative purposes + * Don't use this directly; use the hook version in the component for declarative purposes. * * */ export function isPermittedToCreate({ diff --git a/serde-api/src/main/java/io/kafbat/ui/serde/api/DeserializeResult.java b/serde-api/src/main/java/io/kafbat/ui/serde/api/DeserializeResult.java index 8ae6b5202..c71273da0 100644 --- a/serde-api/src/main/java/io/kafbat/ui/serde/api/DeserializeResult.java +++ b/serde-api/src/main/java/io/kafbat/ui/serde/api/DeserializeResult.java @@ -14,11 +14,11 @@ public final class DeserializeResult { */ public enum Type { /** - * Content is the string. Will be shown as is. + * Content is a string. Will be shown as is. */ STRING, /** - * Content is the json object. Will be parsed by Jackson object mapper. + * Content is a JSON object. Will be parsed by the Jackson object mapper. 
*/ JSON ; @@ -42,7 +42,7 @@ public DeserializeResult(String result, Type type, Map additiona } /** - * Getters for result. + * Getter for result. * @return string representation of deserialized binary data, can be null */ public String getResult() { @@ -50,8 +50,8 @@ public String getResult() { } /** - * Will be show as json dictionary in UI (serialized with Jackson object mapper). - * @return additional information about deserialized value. + * Will be shown as a JSON dictionary in the UI (serialized with the Jackson object mapper). + * @return additional information about the deserialized value. * It is recommended to use primitive types and strings for values. */ public Map<String, Object> getAdditionalProperties() { @@ -59,9 +59,9 @@ public Map<String, Object> getAdditionalProperties() { } /** - * Type of deserialized result. Will be used as hint for some internal logic - * @return type of deserialized result. Will be used as hint for some internal logic - * (ex. if type==STRING smart filters won't try to parse it as json for further usage) + * Type of deserialized result. Will be used as a hint for some internal logic. + * @return type of deserialized result. Will be used as a hint for some internal logic + * (ex., if type==STRING, smart filters won't try to parse it as JSON for further usage) */ public Type getType() { return type; diff --git a/serde-api/src/main/java/io/kafbat/ui/serde/api/PropertyResolver.java b/serde-api/src/main/java/io/kafbat/ui/serde/api/PropertyResolver.java index 74a797907..2553cfedc 100644 --- a/serde-api/src/main/java/io/kafbat/ui/serde/api/PropertyResolver.java +++ b/serde-api/src/main/java/io/kafbat/ui/serde/api/PropertyResolver.java @@ -6,8 +6,8 @@ /** * Provides access to configuration properties. - *Actual implementation uses {@code org.springframework.boot.context.properties.bind.Binder} class - * to bind values to target types. Target type params can be custom configs classes, not only simple types and strings. 
+ * Actual implementation uses {@code org.springframework.boot.context.properties.bind.Binder} class + * to bind values to target types. Target type params can be custom config classes, not only simple types and strings. * */ public interface PropertyResolver { @@ -17,30 +17,30 @@ public interface PropertyResolver { * @param <T> the type of the property * @param key property name * @param targetType type of property value - * @return property value or empty {@code Optional} if property not found + * @return property value or empty {@code Optional} if property is not found */ <T> Optional<T> getProperty(String key, Class<T> targetType); /** - * Get list-property value by name + * Get list property value by name. * * @param <T> the type of the item * @param key list property name * @param itemType type of list element - * @return list property value or empty {@code Optional} if property not found + * @return list property value or empty {@code Optional} if property is not found */ <T> Optional<List<T>> getListProperty(String key, Class<T> itemType); /** - * Get map-property value by name + * Get map property value by name. * - * @param key map-property name + * @param key map property name * @param keyType type of map key * @param valueType type of map value * @param <K> the type of the key * @param <V> the type of the value - * @return map-property value or empty {@code Optional} if property not found + * @return map property value or empty {@code Optional} if property is not found */ <K, V> Optional<Map<K, V>> getMapProperty(String key, Class<K> keyType, Class<V> valueType); diff --git a/serde-api/src/main/java/io/kafbat/ui/serde/api/SchemaDescription.java b/serde-api/src/main/java/io/kafbat/ui/serde/api/SchemaDescription.java index 9064428f0..196f8c760 100644 --- a/serde-api/src/main/java/io/kafbat/ui/serde/api/SchemaDescription.java +++ b/serde-api/src/main/java/io/kafbat/ui/serde/api/SchemaDescription.java @@ -12,9 +12,9 @@ public final class SchemaDescription { /** * Constructor for {@code SchemaDescription}. 
- * @param schema schema descriptions. - * If contains json-schema (preferred) UI will use it for validation and sample data generation. - * @param additionalProperties additional properties about schema (may be rendered in UI in the future) + * @param schema schema description. + * If it contains a JSON schema (preferred), the UI will use it for validation and sample data generation. + * @param additionalProperties additional properties about the schema (may be rendered in the UI in the future) */ public SchemaDescription(String schema, Map<String, Object> additionalProperties) { this.schema = schema; @@ -23,15 +23,15 @@ public SchemaDescription(String schema, Map<String, Object> additionalProperties /** * Schema description text. Can be null. - * @return schema description text. Preferably contains json-schema. Can be null. + * @return schema description text. Preferably contains a JSON schema. Can be null. */ public String getSchema() { return schema; } /** - * Additional properties about schema. - * @return additional properties about schema + * Additional properties about the schema. + * @return additional properties about the schema */ public Map<String, Object> getAdditionalProperties() { return additionalProperties; diff --git a/serde-api/src/main/java/io/kafbat/ui/serde/api/Serde.java b/serde-api/src/main/java/io/kafbat/ui/serde/api/Serde.java index 32ea45048..aa18512ff 100644 --- a/serde-api/src/main/java/io/kafbat/ui/serde/api/Serde.java +++ b/serde-api/src/main/java/io/kafbat/ui/serde/api/Serde.java @@ -5,22 +5,22 @@ import org.apache.kafka.common.header.Headers; /** - * Main interface of serialization/deserialization logic. - * It provides ability to serialize, deserialize topic's keys and values, and optionally provides - * information about data schema inside topic. + * Main interface for serialization/deserialization logic. + * It provides the ability to serialize, deserialize topic's keys and values, and optionally provides + * information about data schema inside a topic. *

* Lifecycle:
* 1. on application startup kafbat-ui scans configs and finds all custom serde definitions
* 2. for each custom serde its own separated child-first classloader is created
- * 3. kafbat-ui loads class defined in configuration and instantiates instance of that class using default, non-arg constructor
+ * 3. kafbat-ui loads the class defined in configuration and instantiates an instance of that class using the default, non-arg constructor
* 4. {@code configure(...)} method called
* 5. various methods called during application runtime
* 6. on application shutdown kafbat-ui calls {@code close()} method on serde instance
*

* Implementation considerations:
- * 1. Implementation class should have default/non-arg contructor
+ * 1. Implementation class should have a default/non-arg constructor
* 2. All methods except {@code configure(...)} and {@code close()} can be called from different threads. So, your code should be thread-safe.
- * 3. All methods will be executed in separate child-first classloader.
+ * 3. All methods will be executed in a separate child-first classloader.
*/ public interface Serde extends Closeable { @@ -39,10 +39,10 @@ enum Target { } /** - * Reads configuration using property resolvers and sets up serde's internal state. + * Reads configuration using property resolvers and sets up the serde's internal state. * * @param serdeProperties specific serde instance's properties - * @param kafkaClusterProperties properties of the custer for what serde is instantiated + * @param kafkaClusterProperties properties of the cluster for which serde is instantiated * @param globalProperties global application properties */ void configure( @@ -53,32 +53,32 @@ void configure( /** * Get serde's description. - * @return Serde's description. Treated as Markdown text. Will be shown in UI. + * @return Serde's description. Treated as Markdown text. Will be shown in the UI. */ Optional<String> getDescription(); /** - * Get schema description for specified topic's key/value. + * Get schema description for the specified topic's key/value. * @param topic topic name * @param type {@code Target} for which {@code SchemaDescription} will be returned. - * @return SchemaDescription for specified topic's key/value. - * {@code Optional.empty} if there is not information about schema. + * @return SchemaDescription for the specified topic's key/value. + * {@code Optional.empty} if there is no information about the schema. */ Optional<SchemaDescription> getSchema(String topic, Target type); /** - * Checks if this Serde can be applied to specified topic's key/value deserialization. + * Checks if this Serde can be applied to the specified topic's key/value deserialization. * @param topic topic name * @param type {@code Target} for which {@code Deserializer} will be applied. 
- * @return true if this Serde can be applied to specified topic's key/value deserialization + * @return true if this Serde can be applied to the specified topic's key/value deserialization */ boolean canDeserialize(String topic, Target type); /** - * Checks if this Serde can be applied to specified topic's key/value serialization. + * Checks if this Serde can be applied to the specified topic's key/value serialization. * @param topic topic name * @param type {@code Target} for which {@code Serializer} will be applied. - * @return true if this Serde can be applied to specified topic's key/value serialization + * @return true if this Serde can be applied to the specified topic's key/value serialization */ boolean canSerialize(String topic, Target type); @@ -93,41 +93,41 @@ default void close() { //---------------------------------------------------------------------------- /** - * Creates {@code Serializer} for specified topic's key/value. - * kafbat-ui doesn't cache {@code Serializes} - new one will be created each time user's message needs to be serialized. + * Creates {@code Serializer} for the specified topic's key/value. + * kafbat-ui doesn't cache {@code Serializers} - a new one will be created each time a user's message needs to be serialized. * (Unless kafbat-ui supports batch inserts). * @param topic topic name * @param type {@code Target} for which {@code Serializer} will be created. - * @return {@code Serializer} for specified topic's key/value. + * @return {@code Serializer} for the specified topic's key/value. */ Serializer serializer(String topic, Target type); /** - * Creates {@code Deserializer} for specified topic's key/value. - * {@code Deserializer} will be created for each kafka polling and will be used for all messages within that polling cycle. + * Creates {@code Deserializer} for the specified topic's key/value. + * {@code Deserializer} will be created for each Kafka polling and will be used for all messages within that polling cycle. 
* @param topic topic name * @param type {@code Target} for which {@code Deserializer} will be created. - * @return {@code Deserializer} for specified topic's key/value. + * @return {@code Deserializer} for the specified topic's key/value. */ Deserializer deserializer(String topic, Target type); /** - * Serializes client's input to {@code bytes[]} that will be sent to kafka as key/value (depending on what {@code Type} it was created for). + * Serializes client's input to {@code byte[]} that will be sent to Kafka as key/value (depending on what {@code Type} it was created for). */ interface Serializer { /** * Serializes input string to bytes. - * @param input string entered by user into UI text field.
Note: this input is not formatted in any way. - * @return serialized bytes. Can be null if input is null or empty string. + * @param input string entered by the user into the UI text field.
Note: this input is not formatted in any way. + * @return serialized bytes. Can be null if input is null or an empty string. */ byte[] serialize(String input); /** * Serializes input string to bytes. Uses provided headers for additional information. - * @param input string entered by user into UI text field.
Note: this input is not formatted in any way. - * @param headers headers entered by user into UI text field.
Note: this input is not formatted in any way. - * @return serialized bytes. Can be null if input is null or empty string. + * @param input string entered by the user into the UI text field.
Note: this input is not formatted in any way. + * @param headers headers entered by the user into the UI text field.
Note: this input is not formatted in any way. + * @return serialized bytes. Can be null if input is null or an empty string. */ default byte[] serialize(String input, Headers headers) { return serialize(input); @@ -139,10 +139,10 @@ default byte[] serialize(String input, Headers headers) { */ interface Deserializer { /** - * Deserializes record's key/value to string. + * Deserializes record's key/value to a string. * @param headers record's headers * @param data record's key/value - * @return deserialized object. Can be null if input is null or empty string. + * @return deserialized object. Can be null if input is null or an empty string. */ DeserializeResult deserialize(RecordHeaders headers, byte[] data); }
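The deserialization contract documented above (record bytes in, a string representation plus a STRING/JSON type hint out, with a null result allowed for null/empty input) can be sketched in a standalone way. Note the types below are simplified, hypothetical stand-ins, not the real `io.kafbat.ui.serde.api` classes, and the JSON-detection heuristic is illustrative only:

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class Main {

    // Simplified stand-ins for io.kafbat.ui.serde.api.DeserializeResult and its
    // Type enum (illustrative only; not the real API classes).
    enum Type { STRING, JSON }

    record Result(String result, Type type, Map<String, Object> additionalProperties) { }

    // A toy deserializer following the documented contract: per the javadoc,
    // type == JSON tells the UI to parse the payload with the Jackson object mapper,
    // while type == STRING is shown as-is (and smart filters skip JSON parsing).
    static Result deserialize(byte[] data) {
        if (data == null || data.length == 0) {
            // The contract allows a null result for null/empty input.
            return new Result(null, Type.STRING, Map.of());
        }
        String text = new String(data, StandardCharsets.UTF_8);
        String trimmed = text.trim();
        // Crude heuristic for the type hint: payloads that look like JSON
        // objects or arrays get the JSON hint; everything else stays STRING.
        Type hint = (trimmed.startsWith("{") || trimmed.startsWith("["))
            ? Type.JSON : Type.STRING;
        return new Result(text, hint, Map.of("encoding", "UTF-8"));
    }

    public static void main(String[] args) {
        System.out.println(deserialize("{\"k\":1}".getBytes(StandardCharsets.UTF_8)).type()); // JSON
        System.out.println(deserialize("plain text".getBytes(StandardCharsets.UTF_8)).type()); // STRING
    }
}
```

A real custom serde would instead implement `io.kafbat.ui.serde.api.Serde` and return the library's own `DeserializeResult` from its `Deserializer`, but the flow is the same: raw record bytes in, a display string plus a type hint out.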