Kafka Plugin for Log Shipper

This document explains how to ingest Netskope Alerts, Events, and web transaction logs in CEF and JSON formats from the Netskope Tenant into Kafka using Cloud Exchange via the Log Shipper Kafka v1.0.0 plugin.

The Log Shipper Kafka plugin is used to collect, transform, and ingest alerts, events, and WebTx logs into a Kafka topic on the Kafka server/cluster. The plugin acts as a producer and publishes messages to the Kafka topic.
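
Conceptually, the plugin's producer role can be pictured with the following minimal sketch. This is not the plugin's actual source; it assumes the kafka-python client library, and the broker address, topic name, and payload are placeholders.

    from kafka import KafkaProducer

    # Placeholder broker endpoint; the plugin uses the configured
    # Kafka Broker Address and Kafka Port.
    producer = KafkaProducer(bootstrap_servers="kafka.example.com:9092")

    # Each transformed alert, event, or WebTx log is published as one
    # message to the configured topic.
    producer.send("netskope-logs", b"CEF:0|Netskope|Cloud Exchange|...")
    producer.flush()  # block until the message is actually delivered
    producer.close()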

Prerequisites

To complete this configuration, you need:

  • A Netskope tenant (or multiple, for example, production and development/test instances) that is already configured in Cloud Exchange.
  • A Netskope Cloud Exchange tenant with the Log Shipper module already configured.
  • Your Kafka server configuration parameters.
  • Connectivity to the Kafka server.

Compatibility

Netskope CE: v4.1.0 and v4.2.0

Performance Matrix

This performance reading is for a Large Stack CE with the below-mentioned VM specifications.

  • Stack Size: Large
  • RAM: 32 GB
  • Cores: 16
  • Alerts/Events: ~ 6 MBps
  • WebTx: ~ 6 MBps

Kafka Plugin Support

  • Event Support: Yes
  • Alert Support: Yes
  • WebTx Support: Yes

All Netskope events, alert logs, and web transaction logs will be shared.

Workflow

  1. Get your Kafka configuration parameters.
  2. Configure the Kafka plugin.
  3. Configure Log Shipper Business Rules for Kafka.
  4. Configure Log Shipper SIEM Mappings for Kafka.
  5. Validate the Kafka plugin.


Get your Kafka Configuration Parameters

The following configuration parameters are needed to configure the Kafka plugin for Netskope CLS. Reach out to the team that manages your Kafka server to get this information. A sketch after this list shows how these parameters might map onto a producer configuration.

  1. Kafka Broker Address: The DNS name, IP address, or FQDN of the Kafka broker to which data will be sent. Note: The plugin only needs one broker that responds to Metadata API requests.
  2. Kafka Port: The port on which the Kafka broker is listening.
  3. Kafka Security Protocol: The security protocol used to authenticate and send data to the Kafka cluster.
  4. Kafka CA Certificate: Kafka CA Certificate in PEM format. Note: This configuration parameter is only applicable when SSL is selected as Kafka Security Protocol.
  5. Kafka Client Certificate: Kafka Client Certificate in PEM format. Note: This configuration parameter is only applicable when SSL is selected as Kafka Security Protocol.
  6. Kafka Client Private Key: Kafka Client Private Key in PEM format. Note: This configuration parameter is only applicable when SSL is selected as Kafka Security Protocol.
  7. Kafka SSL Private Key Password: The password used while loading the certificate. Note: This configuration parameter only applies when SSL is selected as Kafka Security Protocol, and it is only needed when the private key PEM file was generated with a passphrase.
  8. Kafka Topic Name: The Kafka topic to which the logs should be sent. Note: The Kafka Topic Name should not contain any spaces.
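
For illustration only, here is how these parameters could map onto a producer configuration when SSL is selected as the security protocol. The snippet assumes the kafka-python library, and every value shown is a placeholder:

    from kafka import KafkaProducer

    # All values are placeholders for the parameters described above.
    producer = KafkaProducer(
        bootstrap_servers="broker.example.com:9093",  # Kafka Broker Address + Kafka Port
        security_protocol="SSL",                      # Kafka Security Protocol
        ssl_cafile="ca.pem",                          # Kafka CA Certificate
        ssl_certfile="client-cert.pem",               # Kafka Client Certificate
        ssl_keyfile="client-key.pem",                 # Kafka Client Private Key
        ssl_password="key-password",                  # Kafka SSL Private Key Password
    )
    producer.send("netskope-logs", b"...")            # Kafka Topic Name (no spaces)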

Configure the Kafka Plugin

  1. Go to Settings > Plugins. Search for and select the Kafka CLS plugin box.
  2. Add the plugin configuration name and select a mapping, then click Next. Disable the ‘Transform the raw logs’ toggle if you want to ingest your alerts and events in raw JSON format.
  3. Click Next and enter these parameters:
    1. Kafka Broker Address
    2. Kafka Port
    3. Kafka Security Protocol
    4. Kafka CA Certificate
    5. Kafka Client Certificate
    6. Kafka Client Private Key
    7. Kafka SSL Private Key Password
    8. Kafka Topic Name
    9. Log Source Identifier

    Note

    All of the above parameters are needed when the Security Protocol is SSL. If the Security Protocol is PLAINTEXT, only the Kafka Broker Address, Kafka Port, and Kafka Topic Name are required (see the sketch after these steps).

  4. Click Save. The new plugin configuration will be available on the Log Shipper > Plugins page.
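
For comparison, when PLAINTEXT is selected the configuration reduces to just the broker endpoint and topic. A minimal sketch, again assuming kafka-python and placeholder values:

    from kafka import KafkaProducer

    # PLAINTEXT: no certificates or passwords, only broker and topic.
    producer = KafkaProducer(
        bootstrap_servers="broker.example.com:9092",
        security_protocol="PLAINTEXT",
    )
    producer.send("netskope-logs", b"...")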

Configure a Log Shipper Business Rule for Kafka

  1. Go to the Business Rules page. By default, there is a business rule that filters all alerts and events. If you need to filter out any specific type of alert or event, click Create New Rule and configure a new business rule by adding the rule name and filter.
  2. Click Save.

Configure Log Shipper SIEM Mappings for Kafka

  1. Go to the SIEM Mapping page and click Add SIEM Mapping.
  2. Select the Source plugin (Netskope CLS plugin), the Destination plugin (Kafka plugin), and the business rule, then click Save.
  3. To ingest web transactions, select the Netskope WebTx plugin as the Source and the Kafka plugin as the Destination, then click Save.

Validate the Kafka Plugin

Validate in Cloud Exchange

To validate the plugin workflow, go to Log Shipper in Cloud Exchange.

  1. Click Logging.
  2. Search for ingested alerts with the filter “message contains ingested”.
  3. The ingested logs will be filtered.

Validate on the Kafka server

The Kafka plugin sends CEF-formatted data by encoding it to UTF-8, and sends JSON events by serializing them with json.dumps() and then encoding the result to UTF-8.
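
For example, the two payload types would be prepared roughly as follows; the CEF string and the event fields are illustrative:

    import json

    # CEF: already a string, so it is encoded to UTF-8 directly.
    cef_record = "CEF:0|Netskope|Cloud Exchange|..."
    cef_payload = cef_record.encode("utf-8")

    # JSON: serialized with json.dumps(), then encoded to UTF-8.
    event = {"alert_type": "DLP", "user": "user@example.com"}  # illustrative fields
    json_payload = json.dumps(event).encode("utf-8")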

There are many ways to validate that the data reached the Kafka server; here we used Offset Explorer to validate it.
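
If you prefer a programmatic check instead of Offset Explorer, a short consumer script can confirm that messages are arriving on the topic. This sketch assumes the kafka-python library and placeholder connection values:

    from kafka import KafkaConsumer

    # Placeholder broker and topic; match your plugin configuration.
    consumer = KafkaConsumer(
        "netskope-logs",
        bootstrap_servers="broker.example.com:9092",
        auto_offset_reset="earliest",   # read from the start of the topic
        consumer_timeout_ms=10000,      # stop after 10s with no new messages
    )
    for message in consumer:
        print(message.offset, message.value.decode("utf-8"))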

Troubleshooting the Kafka Plugin

A topic is created automatically in Kafka while ingesting data, even after it was deleted.

What can be done?

  1. Add a new topic in Kafka.
  2. Update the Kafka plugin with the new topic name and save it.
  3. Delete the old topic on Kafka.

Following these steps prevents the deleted topic from being recreated on Kafka, and data will be ingested into the newly added topic.
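
The automatic recreation happens because Kafka brokers create topics on first use by default. If you administer the broker yourself, you can disable this behavior in the broker configuration (a broker restart is required):

    # server.properties (Kafka broker configuration)
    auto.create.topics.enable=false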

Receiving “Kafka Broker is unreachable or Kafka cluster might be down. Verify Kafka Broker Address and Kafka Port provided in configuration parameters. Error: NoBrokersAvailable”

This issue might occur due to one of the following reasons:

  1. The Kafka server is actually down.
  2. The disk space on the server is full.

What can be done?

  1. Reach out to your IT team and confirm which of the above reasons is causing the error.
    1. If Kafka is down, restarting it should resolve the issue.
    2. If the disk is full, either delete unwanted files to free up space or add more disk space, then restart the server.
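
As a quick first check from the Cloud Exchange host, you can verify that the broker's port is reachable at all. A generic sketch with a placeholder address and port:

    import socket

    # Placeholder broker address and port from the plugin configuration.
    try:
        with socket.create_connection(("broker.example.com", 9092), timeout=5):
            print("Broker port is reachable")
    except OSError as exc:
        print(f"Cannot reach broker: {exc}")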
