Download the `connect-distributed.properties` [here](https://github.com/IBM/clou
- Modify the `plugin.path` entry in the file to point to the `connector` directory we created in the previous step.
- Now copy the file `connect-distributed.properties` to the `config` folder under the `kafka_2.13-2.5.0` folder.
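For illustration, after the edit the `plugin.path` line in `connect-distributed.properties` would look something like the following excerpt (the path shown is a hypothetical example; substitute your own `[base dir]`):

```
# connect-distributed.properties (excerpt)
plugin.path=/home/user/kafka_splunk_integration/connector
```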

### Run Kafka connect

Open a terminal. Run the below commands. The [base dir] is the directory under which we created the folder `kafka_splunk_integration`.
### Run Kafka connectOpen a terminal. Run the below commands. The [base dir] is the directory under which we created the folder `kafka_splunk_integration`.
This has a Markdown formatting error. The heading "Run Kafka Connect" is immediately followed by the text instructions. There needs to be a blank line between a Markdown heading and the following text.


```
$ export KAFKA_HOME=[base dir]/kafka_splunk_integration/kafka_2.13-2.5.0
$ $KAFKA_HOME/bin/connect-distributed.sh $KAFKA_HOME/config/connect-distributed.properties
```

### Install Splunk

Typically, those interested in this functionality already have Splunk implemented in their environment and have the technical expertise to configure it with the Kafka messaging services.
I'd delete this entire statement. Even if the first sentence is true, the second one is really the point of this guide. Users who already know how to publish events from a Kafka topic to Splunk would have stopped after setting up log streaming from IBM Log Analysis or Activity Tracker to Event Streams.


The following instructions describe how to create a Splunk instance deployed on a workstation in a Docker container, and how to configure Splunk to communicate with IBM Event Streams through Kafka.
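As a rough sketch, launching Splunk in a container typically looks like the following (the image tag, port, and password here are placeholder assumptions; check the Splunk Docker documentation for the options that apply to your setup):

```
# Pull the Splunk Enterprise image (the tag is an example).
docker pull splunk/splunk:latest

# Start Splunk; the image requires accepting the license and setting
# an admin password via environment variables.
docker run -d --name splunk \
  -p 8000:8000 \
  -e "SPLUNK_START_ARGS=--accept-license" \
  -e "SPLUNK_PASSWORD=<admin-password>" \
  splunk/splunk:latest
```

Once the container is up, the Splunk web UI should be reachable at http://localhost:8000.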

Check the following links to ensure that you have satisfied the latest installation prerequisites for your platform.
I think it is clearer overall when prerequisites are addressed at the top of the document; this helps the user avoid surprises halfway through the activity. I also noticed at the top of the document (and throughout) that there is still language like LogDNA.


- Docker: https://docs.docker.com/engine/install/binaries/
- Splunk: https://docs.splunk.com/Documentation/Splunk/8.2.2/Installation/Beforeyouinstall


We will install Splunk inside a container. Open a new terminal and run the commands below.
```
1 change: 1 addition & 0 deletions website/src/pages/log-streaming/content-overview/index.mdx
LogDNA **Streaming** is active, similar to the screen shown below.

**The Consumer is now consuming an event log topic, that LogDNA Streaming is sending to the IBM Cloud Event Streams message bus**.

## djb Test
</InlineNotification>

