Splunk Disclaimer for Log Streaming Instructions #22
@@ -81,9 +81,7 @@ Download the `connect-distributed.properties` [here](https://github.com/IBM/clou
 - Modify the `plugin.path` in the file contents to point to the directory `connector` we created in the previous step.
 - Now copy the file `connect-distributed.properties` to the `config` folder under the `kafka_2.13-2.5.0` folder.
 
-### Run Kafka connect
-
-Open a terminal. Run the below commands. The [base dir] is the directory under which we created the folder `kafka_splunk_integration`.
+### Run Kafka connectOpen a terminal. Run the below commands. The [base dir] is the directory under which we created the folder `kafka_splunk_integration`.
 
 ```
 $ export KAFKA_HOME=[base dir]/kafka_splunk_integration/kafka_2.13-2.5.0
@@ -92,6 +90,17 @@ $ $KAFKA_HOME/bin/connect-distributed.sh $KAFKA_HOME/config/connect-distributed.
 
+### Install Splunk
+
+Typically, those interested in this functionality already have Splunk implemented in their environment and have the technical expertise to configure it with the Kafka messaging services.
Member
I'd delete this entire statement. Even if the first statement is true, I think that the second one is really the point of this guide. Users that already know how to publish events from a Kafka topic to Splunk would have stopped after they had set up log streaming from IBM Log Analysis or Activity Tracker to Event Streams.
+
+The following instructions describe the steps necessary to create a Splunk instance deployed on a workstation using a Docker container. Additionally, configuring the Splunk system to communicate with IBM Event Streams through Kafka is demonstrated.
+
+Check the following links to ensure that you have the latest installation prerequisites satisfied for the platform being used.
Member
I think it can be made clearer overall when prerequisites should be addressed, at the top of the document; this helps the user avoid surprises half-way through the activity. I also noticed at the top of the document (and throughout) that there is still language like LogDNA.
|
|
||
| Docker:https://docs.docker.com/engine/install/binaries/ | ||
|
|
||
| Splunk:https://docs.splunk.com/Documentation/Splunk/8.2.2/Installation/Beforeyouinstall | ||
|
|
||
|
|
||
| We will install Splunk inside a container here. | ||
| Open a new terminal. Run the below commands. | ||
| ``` | ||
|
|
||
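# Editor's sketch, not part of the original PR: commands along these lines
# typically follow here for the splunk/splunk container image. The image tag,
# port mappings, and password placeholder are assumptions, not from this diff.
$ docker pull splunk/splunk:latest
$ docker run -d -p 8000:8000 \
    -e SPLUNK_START_ARGS='--accept-license' \
    -e SPLUNK_PASSWORD='<your-password>' \
    --name splunk splunk/splunk:latest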
Member
This has a markdown formatting error. The heading `Run Kafka Connect` is immediately followed by text instructions. There needs to be an empty line between a markdown heading and following text.
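A minimal sketch of the correction this comment calls for, using the heading and text already present in the diff above:

```markdown
### Run Kafka connect

Open a terminal. Run the below commands. The [base dir] is the directory under which we created the folder `kafka_splunk_integration`.
```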