docs: refine new quick start for producing and consuming messages in cli (#566)
jbyrne-redhat authored Aug 11, 2022
1 parent c3fa820 commit 2f4f98d
Showing 2 changed files with 138 additions and 88 deletions.
205 changes: 127 additions & 78 deletions docs/rhoas/rhoas-produce-consume/README.adoc
@@ -76,133 +76,182 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
END GENERATED ATTRIBUTES
////

[id="chap-produce-consume-rhoas-cli"]
= Getting started with producing and conuming messages in Rhoas Cli for {product-long-kafka}
[id="chap-producing-consuming-rhoas-cli"]
= Producing and consuming messages using the rhoas CLI
ifdef::context[:parent-context: {context}]
:context: produce-consume-rhoas-cli

// Purpose statement for the assembly
[role="_abstract"]
As a developer of applications and services, you can use the `rhoas` command-line interface (CLI) to produce and consume messages for Kafka instances in {product-long-kafka}. This is a useful way to test and debug your Kafka instances.

.Prerequisites
ifndef::community[]
* You have a Red Hat account.
endif::[]
* You have a running Kafka instance in {product-kafka}.
* You've installed the latest version of the `rhoas` CLI. See {base-url}{installation-guide-url-cli}[Installing and configuring the rhoas CLI^].
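
If you want to confirm that the CLI is installed before you begin, you can check its version. This is a minimal sketch; it assumes the `version` subcommand is available in your release. If it isn't, `rhoas --help` lists the commands that your installation supports.

[source]
----
rhoas version
----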

// Condition out QS-only content so that it doesn't appear in docs.
// All QS anchor IDs must be in this alternate anchor ID format `[#anchor-id]` because the ascii splitter relies on the other format `[id="anchor-id"]` to generate module files.
ifdef::qs[]
[#description]
====
Learn how to use the `rhoas` command-line interface (CLI) to produce and consume messages for a Kafka instance.
====

[#introduction]
====
Welcome to the quick start for producing and consuming Kafka messages using the `rhoas` command-line interface (CLI).
In this quick start, you'll use a CLI command to produce messages to different topic partitions in a Kafka instance. You'll then use the {product-long-kafka} web console to inspect the messages. When you're ready, you'll use another CLI command to consume the messages.
====
endif::[]

[id="proc-configure-kafka-instance_{context}"]
== Configuring the {product-kafka} instance
[id="proc-creating-kafka-topic-for-cli-production-consumption_{context}"]
== Creating a Kafka topic in {product-kafka}

[role="_abstract"]
In this task, you'll create a new topic in your Kafka instance. You'll use this topic in later tasks to produce and consume messages.


ifdef::qs[]
.Prerequisites
endif::[]
* You've created a Kafka instance in {product-long-kafka} and the instance is in the *Ready* state.
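
If you prefer to check from the terminal, you can list your Kafka instances and their current state with the CLI. The following is a minimal sketch using the standard `rhoas kafka list` command; the output includes a status column for each instance.

[source]
----
rhoas kafka list
----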

.Procedure
. On the {service-url-kafka}[Kafka Instances^] page of the {product-kafka} web console, click the Kafka instance that you want to add a topic to.
. Select the *Topics* tab.
. Click *Create topic* and follow the guided steps to define the topic details.
+
You must specify the following topic properties:

* *Topic name*: For this quick start, enter `test-topic` as the topic name.
* *Partitions*: Set the number of partitions for the topic. For this quick start, set the value to `2`.
* *Message retention*: Set the message retention time and size. For this quick start, set the retention time to `A week` and the retention size to `Unlimited`.
* *Replicas*: For this release of {product-kafka}, replica values are preconfigured. The number of partition replicas and the minimum number of follower replicas that must be in sync with a partition leader are both set to `1`.
+
After you complete the setup, the new topic appears on the *Topics* page.
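
You can also create the topic from the CLI instead of the web console. The following command is a minimal sketch that assumes the `rhoas kafka topic create` command and its `--partitions` flag; run `rhoas kafka topic create --help` to see the flags available in your CLI version.

[source]
----
rhoas kafka topic create --name=test-topic --partitions=2
----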

.Verification
ifdef::qs[]
* Does `test-topic` appear on the *Topics* page?
endif::[]
ifndef::qs[]
* Verify that `test-topic` appears on the *Topics* page.
endif::[]
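
You can also verify the topic from the CLI. The following is a minimal sketch using `rhoas kafka topic list`, which lists the topics in the Kafka instance that the CLI is currently set to use (select an instance first with `rhoas kafka use` if you haven't already).

[source]
----
rhoas kafka topic list
----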


[id="proc-produce-message_{context}"]
== Producing a message to a {product-kafka} instance
[id="proc-producing-messages_{context}"]
== Producing messages to your Kafka instance

[role="_abstract"]
When you have a Kafka instance with a topic, you're ready to use the CLI to produce messages. In this task, you'll produce four messages to your Kafka instance.

.Prerequisites
* You've created `test-topic` on your Kafka instance.

.Procedure
. Log in to the `rhoas` CLI.
+
[source]
----
$ rhoas login
----

. Specify the Kafka instance that you want to produce messages to.
+
[source,subs="+quotes"]
----
$ rhoas kafka use --name=_<my-kafka-instance>_
----

. Produce your first message to `test-topic`.
+
[source]
----
rhoas kafka topic produce --name=test-topic
----
+
You're prompted to enter a message value.

. Enter `First message` as the message value.

. Repeat the previous steps to produce your second and third messages to `test-topic`. Enter `Second message` and `Third message` as the message values.

. Produce your fourth and final message to `test-topic`. This time, specify a partition value of `1` and a custom message key. An example is shown below.
+
[source]
----
rhoas kafka topic produce --name=test-topic --partition=1 --key="{'location': 'us-east-1'}"
----

. Enter `Fourth message` as the value of the final message.

. To view the messages in the {product-long-kafka} {service-url-kafka}[web console^], perform the following actions:
.. On the *Kafka Instances* page, click the name of your Kafka instance.
.. Select the *Topics* tab, and then click `test-topic`.
.. Select the *Messages* tab.
+
The messages table shows the messages you produced.
.. Observe the following details about the messages you produced:
+
* Because you didn't specify a partition value when producing the first three messages, these messages all went to partition `0` by default.
* The three messages on partition `0` have offset values of `0`, `1`, and `2`.
* The final message is on partition `1` and has the custom key value that you specified.
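
As an alternative to the interactive prompt used in this procedure, you can often supply the message value directly. The following is a sketch only; it assumes that your version of the `produce` command reads the message value from standard input when a value is piped to it. If it doesn't, continue to use the interactive prompt as shown in the steps above.

[source]
----
echo "Hello again" | rhoas kafka topic produce --name=test-topic
----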

.Verification
ifdef::qs[]
* Does the messages table show the four messages that you produced?
endif::[]
ifndef::qs[]
* Verify that the messages table shows the four messages you produced.
endif::[]

[id="proc-consume-message_{context}"]
== Consuming messages from a {product-kafka} instance
[id="proc-consuming-messages_{context}"]
== Consuming messages from your Kafka instance
[role="_abstract"]
When you've produced messages to your Kafka instance and verified them using the {product-long-kafka} web console, you're ready to consume the messages. In this task, you'll use the CLI to consume the messages.

.Prerequisites
* You've produced messages to `test-topic` in your Kafka instance.
* You're logged in to the `rhoas` CLI.
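
If you want to review the topic layout before consuming, you can inspect it from the CLI. This is a minimal sketch that assumes the `rhoas kafka topic describe` command is available in your CLI version; the output typically shows the topic's partitions and configuration.

[source]
----
rhoas kafka topic describe --name=test-topic
----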

.Procedure
. Specify the Kafka instance that you want to consume messages from.
+
[source,subs="+quotes"]
----
$ rhoas kafka use --name=_<my-kafka-instance>_
----

. Consume the messages on partition `0` of `test-topic`.
+
[source]
----
rhoas kafka topic consume --name=test-topic --partition=0
----
+
The CLI displays the messages. Observe that because you didn't specify an offset value, the CLI displays *all* of the messages on partition `0`.

. Consume the third message that you produced by specifying the appropriate partition and offset values.
+
[source]
----
rhoas kafka topic consume --name=test-topic --partition=0 --offset=2
----

. Consume the fourth message, which you produced to partition `1` of the topic.
+
[source]
----
rhoas kafka topic consume --name=test-topic --partition=1
----

.Verification
* When you entered the `kafka topic consume` command, did you see the expected messages?
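
When you're finished with the quick start, you can optionally delete the example topic so that it doesn't keep accumulating test messages. This is a minimal sketch that assumes the standard `rhoas kafka topic delete` command; the CLI typically prompts you to confirm the deletion.

[source]
----
rhoas kafka topic delete --name=test-topic
----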

ifdef::qs[]
[#conclusion]
====
Congratulations! You successfully completed the quick start for producing and consuming messages using the `rhoas` CLI.
====
endif::[]

21 changes: 11 additions & 10 deletions docs/rhoas/rhoas-produce-consume/quickstart.yml
@@ -1,25 +1,26 @@
apiVersion: console.openshift.io/v1
kind: QuickStarts
metadata:
name: producing-consuming-cli
annotations:
draft: false
order: 3
spec:
version: 0.1
type:
text: Quick Start
color: green
displayName: !snippet/title README.adoc#chap-producing-consuming-rhoas-cli
durationMinutes: 10
icon: data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSIzOCIgaGVpZ2h0PSIzOCIgdmlld0JveD0iMCAwIDM4IDM4Ij48ZGVmcz48c3R5bGU+LmF7ZmlsbDojZmZmO30uYntmaWxsOiNlMDA7fTwvc3R5bGU+PC9kZWZzPjxwYXRoIGQ9Ik0yOCwxSDEwYTksOSwwLDAsMC05LDlWMjhhOSw5LDAsMCwwLDksOUgyOGE5LDksMCwwLDAsOS05VjEwYTksOSwwLDAsMC05LTlaIi8+PHBhdGggY2xhc3M9ImEiIGQ9Ik0yMiwyNS42MjVIMTNhLjYyNS42MjUsMCwwLDEsMC0xLjI1aDlhMi4zNzUsMi4zNzUsMCwwLDAsMC00Ljc1SDE1YTMuNjI1LDMuNjI1LDAsMCwxLDAtNy4yNUgyNWEuNjI1LjYyNSwwLDAsMSwwLDEuMjVIMTVhMi4zNzUsMi4zNzUsMCwwLDAsMCw0Ljc1aDdhMy42MjUsMy42MjUsMCwwLDEsMCw3LjI1WiIvPjxwYXRoIGNsYXNzPSJiIiBkPSJNMjUsMTYuNjI1QTMuNjI1LDMuNjI1LDAsMSwxLDI4LjYyNSwxMywzLjYyODg2LDMuNjI4ODYsMCwwLDEsMjUsMTYuNjI1Wm0wLTZBMi4zNzUsMi4zNzUsMCwxLDAsMjcuMzc1LDEzLDIuMzc3NywyLjM3NzcsMCwwLDAsMjUsMTAuNjI1WiIvPjxwYXRoIGNsYXNzPSJiIiBkPSJNMTMsMjguNjI1QTMuNjI1LDMuNjI1LDAsMSwxLDE2LjYyNSwyNSwzLjYyODg2LDMuNjI4ODYsMCwwLDEsMTMsMjguNjI1Wm0wLTZBMi4zNzUsMi4zNzUsMCwxLDAsMTUuMzc1LDI1LDIuMzc3NywyLjM3NzcsMCwwLDAsMTMsMjIuNjI1WiIvPjwvc3ZnPg==
description: !snippet README.adoc#description
prerequisites:
- You have a Red Hat identity
- You've created a Kafka instance and the instance is in the `Ready` state. See the _Getting started with OpenShift Streams for Apache Kafka_ quick start.
- You've installed the latest version of the rhoas CLI. See <a href="https://access.redhat.com/documentation/en-us/red_hat_openshift_application_services/1/guide/bb30ee92-9e0a-4fd6-a67f-aed8910d7da3">Installing and configuring the rhoas CLI</a>.
introduction: !snippet README.adoc#introduction
tasks:
- !snippet/proc README.adoc#proc-creating-kafka-topic-for-cli-production-consumption
- !snippet/proc README.adoc#proc-producing-messages
- !snippet/proc README.adoc#proc-consuming-messages
conclusion: !snippet README.adoc#conclusion
