Kafka Plugin for Log Shipper
This document explains how to ingest Netskope Alerts, Events, and web transaction logs in CEF and JSON format from the Netskope Tenant into Kafka using Cloud Exchange via the Log Shipper Kafka v1.0.0 plugin.
The Log Shipper Kafka plugin collects, transforms, and ingests alerts, events, and WebTx logs to a Kafka topic on the Kafka server/cluster. The plugin acts as a producer that publishes messages to the Kafka topic.
Prerequisites
To complete this configuration, you need:
- A Netskope tenant (or multiple, for example, production and development/test instances) that is already configured in Cloud Exchange.
- A Netskope Cloud Exchange tenant with the Log Shipper module already configured.
- Your Kafka server configuration parameters.
- Connectivity to the Kafka server.
Compatibility
Netskope CE: v4.1.0 and v4.2.0
Performance Matrix
This performance reading is for a Large Stack CE with the below-mentioned VM specifications.
Stack Size | Large (RAM: 32 GB, Cores: 16)
Alerts/Events | ~6 MBps
WebTx | ~6 MBps
Kafka Plugin Support
Event Support | Yes |
Alert Support | Yes |
WebTx Support | Yes |
All Netskope events, alert logs, and web transaction logs will be shared.
Workflow
- Get your Kafka configuration parameters.
- Configure the Kafka plugin.
- Configure Log Shipper Business Rules for Kafka.
- Configure Log Shipper SIEM Mappings for Kafka.
- Validate the Kafka plugin.
Get your Kafka Configuration Parameters
The following configuration parameters are needed to configure the Kafka plugin for Netskope CLS. Reach out to your Kafka server configuration team to get this information.
- Kafka Broker Address: DNS/IP address/FQDN of the Kafka broker to which data will be sent. Note: The plugin needs only one broker that responds to Metadata API requests.
- Kafka Port: The port on which the Kafka broker is configured to listen.
- Kafka Security Protocol: Select the security protocol (PLAINTEXT or SSL) with which authentication will be performed and data will be sent to the Kafka cluster.
- Kafka CA Certificate: Kafka CA Certificate in PEM format. Note: This configuration parameter is only applicable when SSL is selected as Kafka Security Protocol.
- Kafka Client Certificate: Kafka Client Certificate in PEM format. Note: This configuration parameter is only applicable when SSL is selected as Kafka Security Protocol.
- Kafka Client Private Key: Kafka Client Private Key in PEM format. Note: This configuration parameter is only applicable when SSL is selected as Kafka Security Protocol.
- Kafka SSL Private Key Password: The password used while loading the certificate. Note: This configuration parameter only applies when SSL is selected as the Kafka Security Protocol. It is only needed when the PEM private key was generated with a passphrase.
- Kafka Topic Name: Kafka Topic Name to which the logs should be sent. Note: Kafka Topic Name should not have any spaces in it.
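For reference, the sketch below shows how these parameters typically map onto a Kafka producer connection. It is a minimal illustration using the kafka-python library, not the plugin's actual implementation; the broker address, certificate paths, password, and topic name are placeholder assumptions.

```python
# Minimal sketch (assumed values) of how the plugin's configuration
# parameters map onto a Kafka producer connection using kafka-python.
from kafka import KafkaProducer

producer = KafkaProducer(
    # Kafka Broker Address + Kafka Port
    bootstrap_servers=["kafka-broker.example.com:9093"],
    # Kafka Security Protocol: "SSL" or "PLAINTEXT"
    security_protocol="SSL",
    # Kafka CA Certificate (PEM) - only needed for SSL
    ssl_cafile="/path/to/ca.pem",
    # Kafka Client Certificate (PEM) - only needed for SSL
    ssl_certfile="/path/to/client-cert.pem",
    # Kafka Client Private Key (PEM) - only needed for SSL
    ssl_keyfile="/path/to/client-key.pem",
    # Kafka SSL Private Key Password - only if the key is passphrase-protected
    ssl_password="example-password",
)

# Kafka Topic Name (no spaces). With PLAINTEXT as the security protocol,
# only bootstrap_servers (broker address and port) and the topic are required.
producer.send("netskope-logs", value=b"test message")
producer.flush()
```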
Configure the Kafka Plugin
- Go to Settings > Plugins. Search for and select the Kafka CLS plugin box.
- Add the plugin configuration name and select a mapping, then click Next. Disable the ‘Transform the raw logs’ toggle if you want to ingest your alerts and events in raw JSON format.
- Click Next and enter these parameters:
- Kafka Broker Address
- Kafka Port
- Kafka Security Protocol
- Kafka CA Certificate
- Kafka Client Certificate
- Kafka Client Private Key
- Kafka SSL Private Key Password
- Kafka Topic Name
- Log Source Identifier
- Click Save. The new plugin configuration will be available on the Log Shipper > Plugins page.
Note
All of the above parameters are needed when the Security Protocol is SSL. If the Security Protocol is PLAINTEXT, add only the Kafka Broker Address, Kafka Port, and Topic Name.
Configure a Log Shipper Business Rule for Kafka
- Go to the Business Rules page. By default, there is a business rule that filters all alerts and events. If you need to filter out any specific type of alert or event, click Create New Rule and configure a new business rule by adding the rule name and filter.
- Click Save.
Configure Log Shipper SIEM Mappings for Kafka
- Go to the SIEM Mapping page and click Add SIEM Mapping.
- Select the Source plugin (Netskope CLS plugin), Destination plugin (Kafka plugin), and business rule, and click Save.
- For ingestion of Web Transactions, select the Netskope WebTx plugin as the Source and the Kafka plugin as the Destination, and click Save.
Validate the Kafka Plugin
Validate in Cloud Exchange
To validate the plugin workflow, go to Log Shipper in Cloud Exchange.
- Click Logging.
- Search for ingested alerts with the filter “message contains ingested”.
- The ingested logs will be filtered.
Validate on the Kafka server
The Kafka plugin sends CEF-formatted data by encoding it to UTF-8, and sends JSON events by serializing them with json.dumps() and then encoding the result to UTF-8.
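The snippet below is a minimal sketch of that serialization behavior, assuming the kafka-python library and placeholder broker, topic, and record values; it is illustrative only, not the plugin's source code.

```python
import json
from kafka import KafkaProducer

# Assumed broker address and topic name for illustration.
producer = KafkaProducer(bootstrap_servers=["kafka-broker.example.com:9092"])

# JSON events: serialize with json.dumps(), then encode to UTF-8.
event = {"alert_type": "dlp", "timestamp": 1700000000}  # hypothetical event
producer.send("netskope-logs", value=json.dumps(event).encode("utf-8"))

# CEF records: already strings, so they are encoded to UTF-8 directly.
cef_record = "CEF:0|Netskope|Cloud Exchange|1.0|..."  # hypothetical CEF line
producer.send("netskope-logs", value=cef_record.encode("utf-8"))

producer.flush()
```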
There are many ways to validate that the data is sent to the Kafka server; here we use Offset Explorer to validate it.
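As an alternative check, a small consumer script such as the following sketch (the broker address and topic name are assumptions) can confirm that records are arriving on the topic.

```python
from kafka import KafkaConsumer

# Read from the beginning of the topic and stop after 10 seconds of inactivity.
consumer = KafkaConsumer(
    "netskope-logs",                                   # assumed topic name
    bootstrap_servers=["kafka-broker.example.com:9092"],
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,
)

for message in consumer:
    # Records were UTF-8 encoded by the plugin, so decode them for display.
    print(message.value.decode("utf-8"))
```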
Troubleshooting the Kafka Plugin
A topic is created automatically in Kafka while ingesting data, even after it was deleted.
What can be done?
- Add a new topic in Kafka.
- Update the Kafka plugin configuration with the new topic name and save it.
- Delete the old topic on Kafka.
Following these steps prevents the deleted topic from being created again on Kafka, and data will be ingested to the newly added topic.
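If you prefer to create the new topic programmatically instead of through Kafka tooling, a sketch like the one below, using kafka-python's admin client with assumed broker, topic, partition, and replication values, can be used.

```python
from kafka.admin import KafkaAdminClient, NewTopic

# Assumed broker address; partition and replication settings are examples only.
admin = KafkaAdminClient(bootstrap_servers=["kafka-broker.example.com:9092"])
admin.create_topics([
    NewTopic(name="netskope-logs-new", num_partitions=1, replication_factor=1)
])
admin.close()
```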
Receiving “Kafka Broker is unreachable or Kafka cluster might be down. Verify Kafka Broker Address and Kafka Port provided in configuration parameters. Error: NoBrokersAvailable”
This issue might occur due to one of the following reasons:
- The Kafka server is actually down.
- The disk space on the server is full.
What can be done?
- Reach out to your IT team and confirm which of the above reasons is causing the error.
- If Kafka is down, restart it.
- If the disk space is full, either clear unwanted files to free up space or add more disk space, and then restart the server.
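A quick connectivity check such as the sketch below (the broker address is an assumption) can reproduce the NoBrokersAvailable condition outside of Cloud Exchange and help confirm whether the broker is reachable.

```python
from kafka import KafkaProducer
from kafka.errors import NoBrokersAvailable

try:
    # Fails with NoBrokersAvailable if no broker can be reached.
    producer = KafkaProducer(
        bootstrap_servers=["kafka-broker.example.com:9092"],
        request_timeout_ms=5000,
    )
    print("Kafka broker is reachable.")
    producer.close()
except NoBrokersAvailable:
    print("Kafka broker is unreachable; check the server status and disk space.")
```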